{"id":991,"date":"2017-09-07T16:02:43","date_gmt":"2017-09-07T16:02:43","guid":{"rendered":"https:\/\/blogs.mathworks.com\/headlines\/?p=991"},"modified":"2021-11-21T12:58:59","modified_gmt":"2021-11-21T17:58:59","slug":"a-unique-robot-takes-home-big-prize-in-the-amazon-robotics-challenge","status":"publish","type":"post","link":"https:\/\/blogs.mathworks.com\/headlines\/2017\/09\/07\/a-unique-robot-takes-home-big-prize-in-the-amazon-robotics-challenge\/","title":{"rendered":"A unique robot takes home big prize in the Amazon Robotics Challenge"},"content":{"rendered":"<p>Amazon is changing retail as we know it, and this change is built on efficiency. With more than\u00a0<a href=\"https:\/\/www.geekwire.com\/2017\/true-cost-convenience-amazons-annual-shipping-losses-top-7b-first-time\/\" target=\"_blank\" rel=\"noopener\">50 million items<\/a>\u00a0now eligible for free 2-day shipping through their Prime program, the pressure is on. The company is continually looking for ways to reduce the time from when the order is placed to when it&#8217;s delivered to the customer.<\/p>\n<p>&nbsp;<\/p>\n<p><div style=\"width: 510px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/static.seattletimes.com\/wp-content\/uploads\/2017\/08\/72013576-6b5f-11e7-85c1-36a5a028fa1a-1020x607.jpg\" target=\"_blank\" rel=\"noopener\"><img decoding=\"async\" loading=\"lazy\" class=\"\" src=\"https:\/\/static.seattletimes.com\/wp-content\/uploads\/2017\/08\/72013576-6b5f-11e7-85c1-36a5a028fa1a-1020x607.jpg\" width=\"500\" height=\"297\" \/><\/a><p class=\"wp-caption-text\">Robots at the Amazon Robotics research and production facility in North Reading, Massachusetts. Image credit: Ian MacLellan \/ The Seattle Times.<\/p><\/div><\/p>\n<p>&nbsp;<\/p>\n<p>Technology is key to this quest, from Amazon\u2019s\u00a0<a href=\"https:\/\/www.theverge.com\/2017\/3\/24\/15047424\/amazon-prime-air-drone-delivery-public-us-test-mars\" target=\"_blank\" rel=\"noopener\">first drone delivery<\/a>\u00a0earlier this year, to their\u00a0<a href=\"http:\/\/www.seattletimes.com\/business\/amazon\/amazons-army-of-robots-job-destroyers-or-dance-partners\/\" target=\"_blank\" rel=\"noopener\">100,000 robots<\/a>\u00a0in their automated fulfillment centers. But there is one part of the process that has proven difficult to automate.<\/p>\n<p>According to\u00a0<em><a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2017-07-27\/amazon-enlists-researchers-to-build-box-packing-robots\" target=\"_blank\" rel=\"noopener\">Bloomberg<\/a><\/em>, \u201cThe company has a\u00a0fleet of robots that drive around its facilities\u00a0gathering items for orders. But it needs humans for the last step\u00a0\u2014 picking up items of various shapes, then packing the right ones into the correct boxes for shipping. It\u2019s a classic example of an activity that\u2019s simple, almost mindless, for\u00a0humans, but still unattainable\u00a0for robots.\u201d<\/p>\n<h2>Amazon Robotics Challenge<\/h2>\n<p>As one approach to solve this \u201cpick and place\u201d challenge, Amazon Robotics has sponsored a robotics competition for the past three years. Each team participating must design a robot that identifies objects, grasps them, and then safely packs them in boxes for shipment.<\/p>\n<blockquote><p>\u201cAmazon&#8217;s automated warehouses are successful at removing much of the walking and searching for items within a warehouse. 
> However, commercially viable automated picking in unstructured environments still remains a difficult challenge... In order to spur the advancement of these fundamental technologies, Amazon Robotics organizes the Amazon Robotics Challenge (ARC)."
>
> — [Amazon Robotics](https://www.amazonrobotics.com/#/roboticschallenge) (page since moved)

The 2017 contest was designed to be more challenging than in years past. This year, the competing robots couldn't be "pre-programmed" with all the items they'd need to select from. Amazon Robotics gave the teams 40 objects to train their systems, then replaced 20 of them with new items 45 minutes before the actual competition. To match the increase in difficulty, Amazon Robotics raised the total prize money to $250,000.

The task required computer vision-based algorithms to identify objects and plan the correct grasp. *Bloomberg* explained how the teams accomplished this feat: "They now use neural networks, a form of artificial intelligence that helps robots learn to recognize objects with less human programming."

The teams had to teach their robots to see a collection of objects, correctly select the item on the "shopping list", pick it up, and place it in the box. It's not as easy as it sounds: picking up a soft item such as a teddy bear requires a much different grasp than picking up a book. And the robot needs to know what to do if the teddy bear is buried under the other objects in the collection.

## The 2017 ARC champions: Australian Centre for Robotic Vision

Many teams entered industrial arm robots in the contest, adding grippers to the arms and teaching them to pick up and pack objects much as a human would. However, the Australian Centre for Robotic Vision ([ACRV](https://www.roboticvision.org/)) won the competition with a robot dramatically different from past winners. They replaced the robotic arm with a [Cartesian coordinate robot](https://en.wikipedia.org/wiki/Cartesian_coordinate_robot) that looked more like a claw arcade game than a typical industrial arm robot. Their robot, Cartman, used suction cups and a two-fingered claw to grasp and manipulate the items.

![Peter Corke and team with Cartman](https://spectrum.ieee.org/image/MjkzNTA3Ng.jpeg)
*Peter Corke, director of ACRV, and team with Cartman. Image credit: Anthony Weate, QUT.*

Cartman came in first in the final challenge, which required the robot to first stow items, then pick and pack selected items into boxes. The ACRV team, comprising engineers from Queensland University of Technology (QUT), the University of Adelaide, and the Australian National University, took home the $80,000 grand prize.

In addition to posting the highest score, Cartman was also the least expensive robot entered in the competition. Its final cost was under $24,000, significantly less expensive to build than most industrial robots.
It was made of off-the-shelf products and even made use of the engineer's go-to construction aid: zip ties!

![The ACRV team with Cartman](https://3c1703fe8d.site.internapcdn.net/newman/gfx/news/hires/2017/willthisauss.jpg)
*The ACRV team with Cartman. Image credit: Anthony Weate, QUT.*

## Cartman uses deep learning to ID items

"The first approach to the vision perception we tried was a two-stage approach: unsupervised segmentation followed by classification using deep features," stated Trung Thanh Pham, ARC Postdoctoral Research Fellow at ACRV.

[MATLAB](https://www.mathworks.com/products/matlab.html) and [MatConvNet](https://www.mathworks.com/matlabcentral/fileexchange/47811-vlfeat-matconvnet?s_tid=srchtitle) were used to test the idea. MatConvNet is a MATLAB toolbox for implementing Convolutional Neural Networks (CNNs) for computer vision applications. The [Image Processing Toolbox](https://www.mathworks.com/products/image.html) was used for I/O, manipulations, and visualizations as well. (A sketch of this classification step appears at the end of this section.)

"Our final vision system used a deep CNN called [RefineNet](https://arxiv.org/abs/1611.06612), which performs pixel-wise semantic segmentation," added Douglas Morrison, Ph.D. researcher at QUT. "One of our system's main advantages was RefineNet's ability to provide accurate results with a small amount of training data. Our base system was first trained on ~200 images of the known items in cluttered environments (10-20 items per image). At competition time, our training approach was a trade-off between data collection and training time. We found that adding 7 images of each new item in different poses was sufficient to consistently identify that item while still leaving sufficient training time."

The data collection process was automated to decrease the time required: the new items were placed two at a time into an empty tote, and background subtraction was used to automatically generate a new labelled dataset of images. The team was able to collect all of the new data in approximately 5-7 minutes, leaving the remainder of the time for fine-tuning the network on the new data.
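As a rough illustration of that data-capture step, here is a minimal background-subtraction sketch using MATLAB's Image Processing Toolbox. This is not the ACRV team's code: the image file names, the fixed-camera setup, the blob-size threshold, and the cropping strategy are all assumptions made for the example.

```matlab
% Hypothetical sketch: auto-label new items by differencing against an
% image of the empty tote (fixed camera assumed).
emptyTote = imread('empty_tote.png');      % reference shot, no items
withItems = imread('tote_with_items.png'); % same tote after adding items

% Difference against the empty-tote background, then threshold
diffImg = imabsdiff(rgb2gray(withItems), rgb2gray(emptyTote));
mask    = imbinarize(diffImg);             % Otsu threshold by default
mask    = bwareaopen(mask, 500);           % drop small noise blobs (assumed size)
mask    = imfill(mask, 'holes');           % solidify the item regions

% Each connected component becomes one labelled training crop
stats = regionprops(mask, 'BoundingBox');
for k = 1:numel(stats)
    itemCrop = imcrop(withItems, stats(k).BoundingBox);
    imwrite(itemCrop, sprintf('new_item_%02d.png', k));
end
```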
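And here is the classification sketch promised above: a minimal MatConvNet forward pass in the spirit of Pham's "classification using deep features" stage, following MatConvNet's own quick-start pattern. The pretrained model file and the input crop are assumptions for illustration; the team's actual networks and training details are not described in this article.

```matlab
% Hypothetical sketch: classify a segmented crop with a pretrained CNN.
% 'imagenet-vgg-f.mat' is one of MatConvNet's published pretrained models.
net = load('imagenet-vgg-f.mat');
net = vl_simplenn_tidy(net);               % fill in any missing defaults

im  = imread('new_item_01.png');           % a crop from the step above
im_ = single(im);                          % network expects single precision
im_ = imresize(im_, net.meta.normalization.imageSize(1:2));
im_ = im_ - net.meta.normalization.averageImage;  % subtract training mean

res    = vl_simplenn(net, im_);            % forward pass through the CNN
scores = squeeze(gather(res(end).x));      % class scores from the last layer
[bestScore, best] = max(scores);
fprintf('%s (score %.2f)\n', net.meta.classes.description{best}, bestScore);
```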
## Cartman was the only robot to successfully complete the final stage

Cartman did suffer a setback in the second phase of the competition, slipping to fifth place when it dropped an item after taking it out of the tote. But the system's overall ability to detect errors and adjust appropriately helped the team win the final phase. Throughout the competition, the team practiced continually and added improvements to the system.

"One such feature that we had added was the ability for Cartman to search for items that he couldn't see by moving other items out of the way and between parts of the storage system," stated Morrison. "This feature ended up being crucial to our win in the finals task, as the final item was buried at the bottom of a storage bin, and all of the items on top had to be moved before it was visible. As a result, we were the only team to complete the pick phase of the finals task by placing all 'ordered' items into the cardboard boxes."

Congratulations, ACRV!