{"id":126,"date":"2021-01-08T09:46:23","date_gmt":"2021-01-08T07:46:23","guid":{"rendered":"https:\/\/sites.uef.fi\/openar\/?page_id=126"},"modified":"2025-06-03T14:53:13","modified_gmt":"2025-06-03T11:53:13","slug":"background-information","status":"publish","type":"page","link":"https:\/\/sites.uef.fi\/openar\/background-information\/","title":{"rendered":"OpenAR 1.0 background information"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\" id=\"Light-traveling\">Light Traveling from an OLED Display to the Eyes<\/h2>\n\n\n\n<p>When it comes to optics, not everything is ideal in our system, but that does not really matter, since our brain ignores most of these imperfections. All we needed to do was get it close enough. For this reason, the device is not meant to be worn for long periods; it would cause a headache. But after all, this was only meant to be a small demonstration of electronics, optics, programming and 3D printing.<\/p>\n\n\n\n<p>The image comes from a small OLED display, which in this case just shows distances in meters, such as \u201c1.40 m\u201d. The light then goes through a lens, reflects from a mirror, and finally partially reflects off the glass plates into the viewer\u2019s eyes. 
This sounds extremely simple, but some crucial things happen to the light during this journey.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Collimating Light<\/h2>\n\n\n\n<figure class=\"wp-block-image alignright size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1012\" height=\"567\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png\" alt=\"Collimated light\" class=\"wp-image-771\" style=\"width:237px;height:132px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png 1012w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light-300x168.png 300w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light-768x430.png 768w\" sizes=\"auto, (max-width: 1012px) 100vw, 1012px\" \/><figcaption class=\"wp-element-caption\"><em>Light coming from the focal point of a convex lens is collimated after the lens.<\/em> <\/figcaption><\/figure>\n\n\n\n<p>The purpose of the lens is to collimate the light that goes through it, both to get the same-size image reflecting towards both eyes and to get rid of the double image from the front and rear surfaces of the glass plates. Collimation means that all the light rays are parallel after the lens. With a convex lens (preferably a plano-convex lens), all the light that comes from the focal point of the lens is collimated.<\/p>\n\n\n\n<p>So, the display should be placed close to the focal point of the lens. It\u2019s best to have a mechanism to adjust this distance a little to find the ideal focus for your eyes.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Focal Length and Lens Diameter<\/h2>\n\n\n\n<p>The diameter of the lens affects your field of vision. 
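The display-at-the-focal-point condition from the previous section can be sketched with the thin-lens equation. This is a minimal illustration; the 50 mm focal length below is an assumed example value, not necessarily the lens used in this build.

```python
# Thin-lens sketch: an object placed exactly at the focal point produces an
# image at infinity, i.e. a collimated beam. f = 50 mm is an assumed example.

def image_distance(f_mm, object_mm):
    """Thin-lens equation: 1/f = 1/d_o + 1/d_i  ->  d_i = 1 / (1/f - 1/d_o)."""
    inv = 1.0 / f_mm - 1.0 / object_mm
    return float("inf") if inv == 0 else 1.0 / inv

print(image_distance(50.0, 40.0))  # display inside f: virtual image at a finite distance (negative d_i)
print(image_distance(50.0, 50.0))  # display exactly at f: image at infinity (collimated)
```

Nudging the display slightly inside the focal length pulls the virtual image in from infinity, which is why a small adjustment mechanism helps you find a comfortable focus.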
If you use a small diameter lens and\/or a lens with a short focal length, you\u2019ll only see a small part of the display.<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"616\" height=\"298\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_2-e1611555494212.png\" alt=\"Lens size and focal length effect\" class=\"wp-image-975\" style=\"width:282px;height:135px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_2-e1611555494212.png 616w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_2-e1611555494212-300x145.png 300w\" sizes=\"auto, (max-width: 616px) 100vw, 616px\" \/><figcaption class=\"wp-element-caption\"><em>The eye can form images only from the light beams that reach it.<\/em><\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"895\" height=\"237\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_1-e1611555535622.png\" alt=\"Lens size and focal length effect\" class=\"wp-image-978\" style=\"width:526px;height:139px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_1-e1611555535622.png 895w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_1-e1611555535622-300x79.png 300w, 
https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Lens-size-and-focal-length-efect_1-e1611555535622-768x203.png 768w\" sizes=\"auto, (max-width: 895px) 100vw, 895px\" \/><figcaption class=\"wp-element-caption\"><em>With a longer focal length and wider diameter lens, you will get a larger view.<\/em><\/figcaption><\/figure>\n\n\n\n<p><\/p>\n<\/div>\n<\/div>\n\n\n\n<p>In these pictures, I\u2019m trying to demonstrate light traveling from two points (they could be two pixels at the sides of the display) to one\u2019s eye. With a small diameter lens that has a short focal length, one can\u2019t see the pixels at the sides. Of course, one would still see the pixels in the middle of the display. As a result, if you are planning on using a different kind of lens, remember that the focal length and lens diameter will affect the area you can see.<\/p>\n\n\n\n<p>If you want to see how the light beams travel through this system, you can use a ray optics simulator. I like <a rel=\"noreferrer noopener\" href=\"https:\/\/ricktu288.github.io\/ray-optics\/simulator\/\" target=\"_blank\">this one<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Light Traveling Through the Glass Plates<\/h2>\n\n\n\n<figure class=\"wp-block-image alignright size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"908\" height=\"576\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Mirror-reflection.png\" alt=\"Light reflecting from a mirror\" class=\"wp-image-969\" style=\"width:280px;height:177px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Mirror-reflection.png 908w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Mirror-reflection-300x190.png 300w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Mirror-reflection-768x487.png 768w\" sizes=\"auto, (max-width: 908px) 100vw, 908px\" \/><figcaption class=\"wp-element-caption\"><em>As we see 
from a mirror, the object appears to be in the direction from which the light arrives at our eyes.<\/em><\/figcaption><\/figure>\n\n\n\n<p>The ray of light approaching the glass surface is called the incident ray. The angle between the incident ray and the line perpendicular (or normal) to the surface of the glass is called the angle of incidence. According to <a rel=\"noreferrer noopener\" href=\"https:\/\/en.wikipedia.org\/wiki\/Reflection_(physics)\" target=\"_blank\">the law of reflection<\/a>, the angle of incidence is equal to the angle of reflection. In this case, the reflected rays are the ones that will hit your eye and eventually make you see the image. The image that you see appears to be in the direction from which the light hits your eye. It is probably simplest to demonstrate this with a mirror.<\/p>\n\n\n\n<p>Every air-to-glass and glass-to-air interface reflects about 4 % of the light. (To be precise, the 4 % reflection applies only at a perpendicular angle of incidence (\u03b1=0\u00b0) with no absorption or scattering. In this case, the total reflected light from the glass plates is around 15-20 %, depending on the angle of incidence. For this application, we don\u2019t need to know the accurate proportion.)<\/p>\n\n\n\n<figure class=\"wp-block-image alignright size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"654\" height=\"581\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Reflection.png\" alt=\"Light reflecting from a glass plate\" class=\"wp-image-828\" style=\"width:283px;height:252px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Reflection.png 654w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Reflection-300x267.png 300w\" sizes=\"auto, (max-width: 654px) 100vw, 654px\" \/><figcaption class=\"wp-element-caption\"><em>Light partly reflects from the front and rear surfaces of a glass plate. 
The angle of incidence (\u03b1) is equal to the angle of reflection. The angle of refraction can be calculated based on <a href=\"https:\/\/en.wikipedia.org\/wiki\/Snell%27s_law\" target=\"_blank\" rel=\"noreferrer noopener\">Snell\u2019s law<\/a>.<\/em><\/figcaption><\/figure>\n\n\n\n<p>This means that both eyes receive two images, one from the front surface and one from the rear. This double reflection would naturally make you see double or blur the image. However, by collimating the light, you will see only one sharp image with each eye, because the lens in your eye can focus all collimated, parallel rays to the same point. So, if you see a double image with one eye, you need to adjust the distance between the lens and the OLED display.<\/p>\n\n\n\n<p>Most of the light travels through the glass; in the picture, the part that passes through is called the emergent ray. The light then hits the second glass plate in front of the other eye, where a similar partial reflection occurs. The light doesn\u2019t travel straight through the glass: the emergent ray is laterally displaced slightly compared to the incident ray. Since this shift is only about 0.33 mm with 1 mm thick glass plates, you don\u2019t really need to think about it when building the headset.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"First-Surface-vs.-Second-Surface-Mirrors\">First Surface vs. Second Surface Mirrors<\/h2>\n\n\n\n<p>This double image problem is the reason why accurate optical devices have first surface mirrors. A first surface mirror has the reflecting layer, usually aluminum, on top, so all the light reflects from the first surface of the mirror, creating one accurate image. With a regular (second surface) mirror, the aluminum layer is behind glass, so some light reflects from the glass surface before the rest reflects from the aluminum. Since both surfaces reflect, one gets a double image if the light is not collimated. 
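The two figures quoted above, roughly 4 % reflection per surface and roughly 0.33 mm of lateral displacement, can be checked numerically. This is a sketch assuming crown glass with a refractive index of about 1.5 and a 45\u00b0 angle of incidence; neither value is stated explicitly in the build.

```python
import math

# Reconstructing the article's two numbers, assuming n = 1.5 (crown glass)
# and a 45-degree angle of incidence. Both are assumptions for illustration.

def normal_incidence_reflectance(n):
    """Fresnel reflectance per air-glass interface at alpha = 0."""
    return ((n - 1) / (n + 1)) ** 2

def lateral_displacement(thickness_mm, incidence_deg, n):
    """Sideways shift of the emergent ray after a flat plate (via Snell's law)."""
    i = math.radians(incidence_deg)
    r = math.asin(math.sin(i) / n)       # refraction angle inside the glass
    return thickness_mm * math.sin(i - r) / math.cos(r)

print(round(normal_incidence_reflectance(1.5), 3))     # ~0.04 per surface
print(round(lateral_displacement(1.0, 45.0, 1.5), 2))  # ~0.33 mm for 1 mm glass
```

With these assumptions the numbers land on about 4 % per surface and 0.33 mm of shift, matching the values quoted in the text.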
In this application, it doesn\u2019t matter whether you have a first or second surface mirror, as long as the lens comes before the mirror, since the light will then be collimated. If you have the mirror before the lens, a second surface mirror will give you a double image. So, if you want to change the setup, you might want to consider this. The good thing about regular mirrors is that they are much easier to clean: glass is easy to wipe clean, whereas a single fingerprint can ruin a first surface mirror. If you wipe the soft aluminum, all you get is a plate full of scratches.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Combining Separate Images to the Left and Right Eye<\/h2>\n\n\n\n<p>Getting rid of the double reflection from a single glass plate does not change the fact that the left and right eye get separate images. Combining these images is something your brain can do if the circumstances are correct, or at least close enough. Practically, this means that you have to adjust the angle of the glass plates. If the light does not seem to come from the same origin, our brain lets us see two images: you will just see two reflections of the OLED display floating in the air at some distance. By changing the angle of the reflecting plates so that the light seems to come from the same point to both eyes, your brain simply decides to combine these images. You will likely find this an amusing phenomenon when you try it. The images do not need to be that close for them to combine. 
Suddenly, they will just appear that way.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"3Dvision\">Human 3D Vision<\/h2>\n\n\n\n<figure class=\"wp-block-image alignright size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1021\" height=\"897\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Convergence1.png\" alt=\"Convergence\" class=\"wp-image-961\" style=\"width:236px;height:207px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Convergence1.png 1021w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Convergence1-300x264.png 300w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Convergence1-768x675.png 768w\" sizes=\"auto, (max-width: 1021px) 100vw, 1021px\" \/><figcaption class=\"wp-element-caption\"><em>The eyes have to rotate inwards and adjust their lenses in order to see an object at a close distance.<\/em><\/figcaption><\/figure>\n\n\n\n<p>There are plenty of different mechanisms that enable our three-dimensional vision. These are divided into binocular and monocular cues. Binocular cues are based on us having two eyes about 6 \u2013 7 cm apart. Our AR glasses use a binocular cue called vergence for depth perception. Convergence refers to the simultaneous inward movement of both eyes toward each other. If you want to see a precise image, you have to look in the direction the light comes from. If you look at something at a long distance, your eyes point almost straight forward; if you look at something close to you, the muscles in your eyes have to work to rotate your eyes inwards. From this muscle work and stretching, our brain gets a quite precise idea of the object\u2019s location. Of course, our eyes do a bit more than rotate in their sockets. The lens in the eye actively changes its shape to bring images into focus. When you look far away, the lens is slim. 
When you look close up, the muscles in your eye make the lens thicker. This mechanism only gives us an idea of the distance of the image, though; it does not make us see a 3D image.<\/p>\n\n\n\n<figure class=\"wp-block-image alignright size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"981\" height=\"547\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Stereopsis.png\" alt=\"Stereopsis demonstration\" class=\"wp-image-873\" style=\"width:339px;height:188px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Stereopsis.png 981w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Stereopsis-300x167.png 300w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Stereopsis-768x428.png 768w\" sizes=\"auto, (max-width: 981px) 100vw, 981px\" \/><figcaption class=\"wp-element-caption\"><em>The brain can combine two different images into one three-dimensional image.<\/em><\/figcaption><\/figure>\n\n\n\n<p>If you want to see a three-dimensional image, you need another type of binocular cue, stereopsis. If you look at some object, your left eye sees it from a slightly different angle than your right eye, so you receive two slightly different images. Yet we \u201csee\u201d only one three-dimensional image, a combination of the two, which is a product of our brain. To use this in AR glasses, you would need two displays, each showing a slightly different image, and a way to bring these images separately to the left and the right eye.<\/p>\n\n\n\n<p>There are also many ways to estimate 3D shapes and distances with one eye only. They are called monocular cues and can be inferred, for example, from the object\u2019s size, shadows, contrast, perspective, etc. 
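The vergence cue described earlier can be put into numbers. This sketch assumes a 65 mm interpupillary distance (within the 6\u20137 cm range mentioned above) and computes the total inward rotation of the eyes for a target at a given distance; the distances are example values.

```python
import math

# Rough vergence-angle sketch for the binocular cue described above.
# The 65 mm interpupillary distance is an assumed typical value.

def vergence_angle_deg(distance_m, ipd_m=0.065):
    """Total inward rotation of the two eyes fixating a point at distance_m."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

print(round(vergence_angle_deg(1.4), 2))   # e.g. a target at the "1.40 m" reading
print(round(vergence_angle_deg(10.0), 2))  # far target: eyes nearly parallel
```

The angle shrinks quickly with distance, which is why vergence is a useful depth cue only at close range.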
If you want to know more, here is a page with <a rel=\"noreferrer noopener\" href=\"https:\/\/uxdesign.cc\/human-eyes-understanding-of-space-for-augmented-reality-d5ce4d9fa37b\" target=\"_blank\">more information<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"ArduinoIDE\">Using Arduino IDE<\/h2>\n\n\n\n<p>Attach your Arduino Nano to your computer with a USB cable. Copy the code you want to use and paste it into the editor window. Select your board via Tools \u2192 Board \u2192 Arduino Nano, the processor via Tools \u2192 Processor \u2192 ATmega328P, and the port via Tools \u2192 Port (you\u2019ll get a list of options). Then click <em>Verify<\/em>. If you don\u2019t have all the required libraries, you will get an error naming the missing one, for example \u201cAdafruit_SSD1306.h: No such file or directory\u201d. You can add a library by selecting Sketch \u2192 Include Library \u2192 Manage Libraries, writing e.g. &#8220;Adafruit_SSD1306&#8221; in the search field, and installing the missing library.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"614\" src=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE-1024x614.png\" alt=\"Arduino IDE\" class=\"wp-image-945\" style=\"width:457px;height:274px\" srcset=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE-1024x614.png 1024w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE-300x180.png 300w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE-768x460.png 768w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE-1536x921.png 1536w, https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Arduino-IDE.png 1555w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption 
class=\"wp-element-caption\"><em>Adding libraries to Arduino IDE.<\/em><\/figcaption><\/figure>\n\n\n\n<p>Click <em>Verify<\/em> again. If no more libraries are missing, you can then send the program to your Arduino by clicking <em>Upload<\/em>. When the program says \u201cDone uploading,\u201d you are all set.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"Cutting-Glass-or-Mirror\">Cutting Glass or Mirror<\/h2>\n\n\n\n<p>Mind the sharp edges when handling glass or mirror plates. Use gloves and goggles. Make a cut across the plate with a glasscutter dipped in oil. You can use a ruler to make a straight line. Snap the glass at the cut by bending the glass plate. Smooth the edges carefully with sandpaper so that you don\u2019t have any sharp edges.<\/p>\n\n\n\n<p>Here is a good <a rel=\"noreferrer noopener\" href=\"https:\/\/www.youtube.com\/watch?v=goRcunMEF2I\" target=\"_blank\">video of cutting glass and mirror<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Light Traveling from an OLED Display to The Eyes When it comes to optics, not everything is ideal in our system but, it does not really matter since our brain totally ignores most of these imperfections. All we needed to do was get it close enough. 
For this reason, the device is not meant to [&hellip;]<\/p>\n","protected":false},"author":437,"featured_media":0,"parent":0,"menu_order":2,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"footnotes":""},"class_list":["post-126","page","type-page","status-publish","hentry"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>OpenAR 1.0 background information - OpenAR<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/sites.uef.fi\/openar\/background-information\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"OpenAR 1.0 background information - OpenAR\" \/>\n<meta property=\"og:description\" content=\"Light Traveling from an OLED Display to The Eyes When it comes to optics, not everything is ideal in our system but, it does not really matter since our brain totally ignores most of these imperfections. All we needed to do was get it close enough. For this reason, the device is not meant to [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/sites.uef.fi\/openar\/background-information\/\" \/>\n<meta property=\"og:site_name\" content=\"OpenAR\" \/>\n<meta property=\"article:modified_time\" content=\"2025-06-03T11:53:13+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1012\" \/>\n\t<meta property=\"og:image:height\" content=\"567\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"10 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/\",\"url\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/\",\"name\":\"OpenAR 1.0 background information - OpenAR\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/wp-content\\\/uploads\\\/sites\\\/263\\\/2021\\\/01\\\/Collimated_light.png\",\"datePublished\":\"2021-01-08T07:46:23+00:00\",\"dateModified\":\"2025-06-03T11:53:13+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/#primaryimage\",\"url\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/wp-content\\\/uploads\\\/sites\\\/263\\\/2021\\\/01\\\/Collimated_light.png\",\"contentUrl\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/wp-content\\\/uploads\\\/sites\\\/263\\\/2021\\\/01\\\/Collimated_light.png\",\"width\":1012,\"height\":567,\"caption\":\"Light coming from the focal point of a convex lens is collimated after the 
lens.\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/background-information\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"OpenAR 1.0 background information\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/#website\",\"url\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/\",\"name\":\"OpenAR\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/sites.uef.fi\\\/openar\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"OpenAR 1.0 background information - OpenAR","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/sites.uef.fi\/openar\/background-information\/","og_locale":"en_US","og_type":"article","og_title":"OpenAR 1.0 background information - OpenAR","og_description":"Light Traveling from an OLED Display to The Eyes When it comes to optics, not everything is ideal in our system but, it does not really matter since our brain totally ignores most of these imperfections. All we needed to do was get it close enough. For this reason, the device is not meant to [&hellip;]","og_url":"https:\/\/sites.uef.fi\/openar\/background-information\/","og_site_name":"OpenAR","article_modified_time":"2025-06-03T11:53:13+00:00","og_image":[{"width":1012,"height":567,"url":"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png","type":"image\/png"}],"twitter_card":"summary_large_image","twitter_misc":{"Est. 
reading time":"10 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/sites.uef.fi\/openar\/background-information\/","url":"https:\/\/sites.uef.fi\/openar\/background-information\/","name":"OpenAR 1.0 background information - OpenAR","isPartOf":{"@id":"https:\/\/sites.uef.fi\/openar\/#website"},"primaryImageOfPage":{"@id":"https:\/\/sites.uef.fi\/openar\/background-information\/#primaryimage"},"image":{"@id":"https:\/\/sites.uef.fi\/openar\/background-information\/#primaryimage"},"thumbnailUrl":"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png","datePublished":"2021-01-08T07:46:23+00:00","dateModified":"2025-06-03T11:53:13+00:00","breadcrumb":{"@id":"https:\/\/sites.uef.fi\/openar\/background-information\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/sites.uef.fi\/openar\/background-information\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/sites.uef.fi\/openar\/background-information\/#primaryimage","url":"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png","contentUrl":"https:\/\/sites.uef.fi\/openar\/wp-content\/uploads\/sites\/263\/2021\/01\/Collimated_light.png","width":1012,"height":567,"caption":"Light coming from the focal point of a convex lens is collimated after the lens."},{"@type":"BreadcrumbList","@id":"https:\/\/sites.uef.fi\/openar\/background-information\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/sites.uef.fi\/openar\/"},{"@type":"ListItem","position":2,"name":"OpenAR 1.0 background 
information"}]},{"@type":"WebSite","@id":"https:\/\/sites.uef.fi\/openar\/#website","url":"https:\/\/sites.uef.fi\/openar\/","name":"OpenAR","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/sites.uef.fi\/openar\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"}]}},"_links":{"self":[{"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/pages\/126","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/users\/437"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/comments?post=126"}],"version-history":[{"count":2,"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/pages\/126\/revisions"}],"predecessor-version":[{"id":2048,"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/pages\/126\/revisions\/2048"}],"wp:attachment":[{"href":"https:\/\/sites.uef.fi\/openar\/wp-json\/wp\/v2\/media?parent=126"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}