{"id":680,"date":"2018-01-17T01:23:48","date_gmt":"2018-01-17T00:23:48","guid":{"rendered":"https:\/\/blogs.mathworks.com\/student-lounge\/?p=680"},"modified":"2019-04-01T14:31:17","modified_gmt":"2019-04-01T12:31:17","slug":"robocupathome-education-workshop","status":"publish","type":"post","link":"https:\/\/blogs.mathworks.com\/student-lounge\/2018\/01\/17\/robocupathome-education-workshop\/","title":{"rendered":"Preparing for the RoboCup Major Leagues"},"content":{"rendered":"<p>The Racing Lounge is back from Christmas break. I wish you all a succesful and healthy 2018!<br \/>\nIn today&#8217;s post,\u00a0<a href=\"https:\/\/www.mathworks.com\/matlabcentral\/profile\/authors\/3069683-sebastian-castro\">Sebastian Castro<\/a>\u00a0discusses his experiences with a robotics workshop he helped deliver\u00a0at the\u00a0RoboCup Asia-Pacific\u00a0(RCAP) event in Bangkok, Thailand.<\/p>\n<p>As a reminder,\u00a0MathWorks is a global sponsor of RoboCup which comes with <a href=\"https:\/\/www.mathworks.com\/academia\/student-competitions\/robocup.html\">many benefits to student teams<\/a>.\u00a0At RCAP, we\u00a0had a booth with information, giveaways, software demonstrations, and more. We were even joined by our distributors in Southeast Asia, <a href=\"https:\/\/www.techsource-asia.com\/\">TechSource<\/a>. Next time you&#8217;re at a RoboCup event, please stop by and say &#8220;hi&#8221;.<\/p>\n<p style=\"text-align: center;\"><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-726 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/mathworks_booth-e1515072633874.jpg\" width=\"667\" height=\"500\" \/><em>MathWorks and TechSource at our booth<\/em><\/p>\n<h1>Background<\/h1>\n<p>One of the biggest challenges in RoboCup is the jump from the Junior (pre-university) leagues to the Major (university-level) leagues. 
Typically, there is a significant learning curve that stops even the most successful Junior teams from continuing in RoboCup once they move from high school to university.<\/p>\n<p>Many RoboCup organizers are aware of this issue, which has led them to create intermediate challenges targeted at overcoming this learning curve. Unlike the Major league competitions, which are primarily research platforms for students pursuing advanced degrees, these challenges are more suited for educating undergraduate students. Some examples are the\u00a0<a href=\"http:\/\/oarkit.intelligentrobots.org\/home\/the-arena\/\">RoboCupRescue Rapidly Manufactured Robot<\/a>\u00a0and the <a href=\"http:\/\/cospacerobot.org\/competition\">RoboCup Asia-Pacific CoSpace<\/a>\u00a0challenges.<\/p>\n<p>What made\u00a0RCAP extra special for us was\u00a0being invited\u00a0by\u00a0Dr. Jeffrey Tan\u00a0to help with another challenge of this type. Dr. Tan has been\u00a0an organizer with the <a href=\"http:\/\/www.robocupathome.org\/\">RoboCup@Home<\/a> league for over 4 years, as well as an advisor for the <a href=\"http:\/\/openbotics.org\/kamerider\/index.php?title=Main_Page\">KameRider team<\/a>. We\u00a0decided to\u00a0deliver a joint workshop for his\u00a0<a href=\"http:\/\/www.robocupathomeedu.org\/\">RoboCup@Home Education<\/a> league. 
This was a great opportunity for us because it allowed us to introduce MATLAB into various aspects of robot design and programming, and served as a good benchmark of our tools in a real, world-class competition.<\/p>\n<p>There were\u00a04 teams participating:<\/p>\n<ul>\n<li><strong>Tamagawa Science Club<\/strong>\u00a0&#8212; Tamagawa Academy, Japan &#8212; high school<\/li>\n<li><strong>Nalanda<\/strong>\u00a0&#8212; Genesis Global School, India &#8212; high school<\/li>\n<li><strong>KameRider EDU<\/strong>\u00a0&#8212; Nankai University, China\/Universiti Teknologi Malaysia &#8212; undergraduate<\/li>\n<li><strong>Skuba JR<\/strong>\u00a0&#8212; Kasetsart University, Thailand &#8212; undergraduate<\/li>\n<\/ul>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-696 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/WorkshopPic-e1515007709965.jpg\" width=\"916\" height=\"500\" \/><\/p>\n<p style=\"text-align: center;\"><em>RoboCup@Home Education Crew &#8212; Courtesy of Dr. Kanjanapan Sukvichai (Skuba JR advisor)<\/em><\/p>\n<h1>Days 1-2: The Workshop<\/h1>\n<p>For\u00a0the first 2 days, we came up with an ambitious curriculum of topics that got the students up and running with a TurtleBot2 from scratch &#8212; including an <a href=\"http:\/\/wiki.ros.org\/openni_camera\">RGB + depth sensor<\/a> and a <a href=\"http:\/\/wiki.ros.org\/turtlebot_arm\">robot arm<\/a>.<\/p>\n<p>All\u00a0we requested in advance was that students bring a laptop with Ubuntu 14.04. We then installed the <a href=\"http:\/\/www.ros.org\/\">Robot Operating System (ROS)<\/a> and MATLAB with a <a href=\"https:\/\/www.mathworks.com\/academia\/student-competitions\/robocup.html\">complimentary license from our Web site<\/a>.<\/p>\n<p>The goal\u00a0was to develop the pieces of a typical RoboCup@Home algorithm. 
If\u00a0the robot has a map of its environment\u00a0and receives a spoken command &#8212; for example, &#8220;<em>bring me the water bottle from the kitchen<\/em>&#8221; &#8212; the diagram below shows an example of the components needed to complete that task.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-698 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/AlgorithmSchematic-e1515006827770.png\" width=\"849\" height=\"400\" \/><\/p>\n<p>&nbsp;<\/p>\n<h2>A. Speech Recognition and Synthesis<\/h2>\n<p>Speech recognition was done with <a href=\"http:\/\/wiki.ros.org\/pocketsphinx\">CMU PocketSphinx<\/a>, while speech input and synthesis were done with the <a href=\"http:\/\/wiki.ros.org\/audio_common\">audio_common stack<\/a>. In the workshop, we showed how to detect speech, look for key words in a dictionary, and take actions based on those key words.\u00a0This was all done outside of MATLAB.<\/p>\n<p>Several students asked about MATLAB&#8217;s capabilities for speech recognition. 
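The keyword-spotting step described above &#8212; look for known words in the recognized text, then act on them &#8212; is independent of the speech engine used. Here is a minimal sketch, in Python purely for illustration (the workshop used ROS tools and MATLAB, and the word lists below are made up):

```python
# Minimal keyword-spotting sketch (hypothetical word lists): scan a
# recognized utterance for known location and object keywords, then
# return what to fetch and from where. Any speech engine (PocketSphinx,
# Google Cloud Speech, ...) can supply the input text.

LOCATIONS = {"kitchen", "bedroom", "living room"}
OBJECTS = {"water bottle", "cup", "snack"}

def parse_command(utterance):
    """Return (object, location) found in the utterance, or None."""
    text = utterance.lower()
    obj = next((o for o in OBJECTS if o in text), None)
    loc = next((l for l in LOCATIONS if l in text), None)
    if obj and loc:
        return obj, loc
    return None

print(parse_command("Bring me the water bottle from the kitchen"))
# ('water bottle', 'kitchen')
```

The same dictionary-lookup idea carries over directly once the recognized text is available in MATLAB as a string.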
Right now, there are two ways to get this\u00a0working:<\/p>\n<ol>\n<li>Use the ROS tools above to publish the detected\u00a0text on a ROS topic and\u00a0<a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ref\/robotics.subscriber.html\">subscribe to\u00a0it in MATLAB<\/a>.<\/li>\n<li><a href=\"https:\/\/www.mathworks.com\/help\/matlab\/matlab_external\/call-user-defined-custom-module.html\">Call a user-defined Python speech module\u00a0from MATLAB<\/a>.<\/li>\n<\/ol>\n<p>Once the text is in MATLAB, you can take advantage of\u00a0its <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/characters-and-strings.html\">capabilities for characters and strings<\/a>, or even the new <a href=\"https:\/\/www.mathworks.com\/products\/text-analytics.html\">Text Analytics Toolbox<\/a>.<\/p>\n<p>I have personally gotten\u00a0approach #2 working with this\u00a0<a href=\"https:\/\/pypi.python.org\/pypi\/SpeechRecognition\/\">Python SpeechRecognition package<\/a> &#8212;\u00a0in particular, with\u00a0Google Cloud Speech and CMU PocketSphinx. The image below shows a simple example I ran which used Text Analytics Toolbox to cluster my speech into two categories &#8212; food and drink. There are words like &#8220;going&#8221;, &#8220;have&#8221;, and &#8220;some&#8221; which give us no extra information. Luckily, the toolbox has <a href=\"https:\/\/www.mathworks.com\/help\/textanalytics\/examples\/prepare-text-data-for-analysis.html\">preprocessing capabilities<\/a> to address such issues.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-692 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/textAnalytics-e1515007046471.png\" width=\"626\" height=\"500\" \/><\/p>\n<h2>B. Mapping and Navigation<\/h2>\n<p>To perform mapping and navigation tasks, we used the following workflow:<\/p>\n<ol>\n<li>Generate a map of the environment\u00a0using the <a href=\"http:\/\/wiki.ros.org\/turtlebot_navigation\/Tutorials\/Build%20a%20map%20with%20SLAM\">existing TurtleBot\u00a0gmapping example<\/a>\u00a0and <a href=\"http:\/\/wiki.ros.org\/turtlebot_teleop\">driving the robot around<\/a>.<\/li>\n<li>In the example above, the latest map is published on a ROS topic (\/map). So, we can read the map into MATLAB as an <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/occupancy-grids.html\">occupancy grid<\/a>\u00a0and save it to a file.<\/li>\n<li>Once\u00a0the map is\u00a0in MATLAB, we can sample a <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/probabilistic-roadmaps-prm.html\">probabilistic roadmap (PRM)<\/a>\u00a0and use it to find a path between two points.<\/li>\n<li>Then, we can program the robot to\u00a0follow this path with the <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/pure-pursuit-controller.html\">Pure Pursuit algorithm<\/a>.<\/li>\n<\/ol>\n<p>Below you can see an\u00a0example map and\u00a0path I generated\u00a0near my office. Assuming the map is static, steps 3 and 4 can be repeated for different start and goal points as needed.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-700 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/MapNavigation-e1515006509561.png\" width=\"699\" height=\"600\" \/><\/p>\n<h2>C. Computer Vision and Control<\/h2>\n<p>For this task, we operated exclusively in the MATLAB world. 
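The control scheme we built on top of the vision pipeline (an on-off controller with deadband, described in the workflow below) is simple enough to sketch in a few lines. Here it is in Python purely for illustration &#8212; the workshop code was MATLAB, and the gains and thresholds are made-up values, not the ones we tuned on the robot:

```python
# On-off controller with deadband (illustrative gains/thresholds):
# rotate until the detected object is roughly centered in the image,
# then drive forward until the depth reading says it is close enough.

def deadband_controller(x_offset_px, depth_m,
                        x_deadband=40, target_depth=0.8, depth_deadband=0.1):
    """Map object pixel offset and depth to (linear, angular) velocity."""
    # Angular: rotate only if the object is outside the horizontal deadband.
    if x_offset_px > x_deadband:
        angular = -0.3          # object to the right -> turn right
    elif x_offset_px < -x_deadband:
        angular = 0.3           # object to the left -> turn left
    else:
        angular = 0.0
    # Linear: drive forward only once centered and still too far away.
    if angular == 0.0 and depth_m > target_depth + depth_deadband:
        linear = 0.2
    else:
        linear = 0.0
    return linear, angular

print(deadband_controller(5, 2.0))    # (0.2, 0.0): centered, far -> drive
print(deadband_controller(120, 2.0))  # (0.0, -0.3): off-center -> rotate
```

The deadbands keep the robot from oscillating around the setpoint; without them, an on-off controller chatters between its two output values.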
We received many comments about prototyping with images being easier in MATLAB than with <a href=\"https:\/\/opencv.org\/\">OpenCV<\/a>, mostly because more challenging languages (Python or C++) were required for the latter.<\/p>\n<p>Our vision and control workflow was:<\/p>\n<ol>\n<li>Using the <a href=\"https:\/\/www.mathworks.com\/help\/images\/image-segmentation-using-the-color-thesholder-app.html\">Color Thresholder app<\/a> to\u00a0define thresholds for tracking\u00a0an object of interest<\/li>\n<li>Performing <a href=\"https:\/\/www.mathworks.com\/help\/vision\/ref\/blobanalysis.html\">blob analysis<\/a>\u00a0to\u00a0find the object\u00a0position<\/li>\n<li>Estimating\u00a0the distance to the\u00a0object using the depth image at the detected object\u00a0position<\/li>\n<li>Moving the robot depending on the object\u00a0position and depth. We started with simple on-off controllers with deadband on both linear and angular velocity.<\/li>\n<\/ol>\n<p>At this point, students had reference MATLAB code for a closed-loop vision-based controller with ROS. Over the next few days, they were encouraged to modify this code\u00a0to make their robots track objects more robustly. Let&#8217;s\u00a0remember that most of the students had never been exposed to MATLAB before!<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-706 size-large\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/VisionAlgo-1-1024x389.png\" width=\"1024\" height=\"389\" \/><\/p>\n<h2>D. Manipulation<\/h2>\n<p>The robots were fitted with <a href=\"http:\/\/wiki.ros.org\/turtlebot_arm\">TurtleBot arms<\/a>. There were two sides to getting these arms working: hardware and software.<\/p>\n<p>On the hardware side, we pointed students to the <a href=\"http:\/\/wiki.ros.org\/dynamixel_controllers\/Tutorials\">ROBOTIS Dynamixel servo ROS tutorials<\/a>. The goal here was to make sure there was a ROS interface to joint position controllers for each of the motors in the robot arm. 
This would allow us to control the arms with controllers in MATLAB.<\/p>\n<p>On the software side, the steps were:<\/p>\n<ol>\n<li><a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ref\/importrobot.html\">Import the robot arm description (URDF) file<\/a> into MATLAB as a <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/rigid-body-tree-robot-model.html\">rigid body tree\u00a0representation<\/a><\/li>\n<li>Become familiar with the <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/inverse-kinematics-algorithms.html\">Inverse Kinematics (IK)<\/a> functionality in the Robotics System Toolbox<\/li>\n<li>Follow a path\u00a0by using IK at several points &#8212; first in simulation and then on the real robot arm<\/li>\n<\/ol>\n<p>The figure below shows a path that linearly interpolates between points. However, smoother trajectories are also possible with a little more math, or with tools\u00a0like the <a href=\"https:\/\/www.mathworks.com\/products\/curvefitting.html\">Curve Fitting Toolbox<\/a>.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-704 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/ManipMotion-e1515006796360.png\" width=\"509\" height=\"500\" \/><\/p>\n<h1>Days 3-5: The\u00a0Competition<\/h1>\n<p>The workshop days were meant to raise awareness of software tools needed to succeed in the competition.\u00a0With a stack of ROS tutorials, example MATLAB files, and other useful links, the students were now tasked with taking our reference applications and using them to compete in the same challenges as the major leagues.<\/p>\n<p>These challenges were as follows. Credit to Dr. Tan&#8217;s KameRider team for the sample YouTube videos.<\/p>\n<ol>\n<li><strong>Speech and Person Recognition:<\/strong> Demonstrating basic speech and vision functionality. An example would be asking the robot &#8220;how many people are in front of you?&#8221; and getting a correct answer. 
[<a href=\"https:\/\/www.youtube.com\/watch?v=dv3RUpUtje4\">Person Video<\/a>] [<a href=\"https:\/\/www.youtube.com\/watch?v=-HV5sv0scp8\">Speech Video<\/a>]<\/li>\n<li><strong>Help-me-carry:<\/strong> Following a person and assisting them by carrying an object. [<a href=\"https:\/\/www.youtube.com\/watch?v=CMarj1DT9t0\">Help-me-carry Video<\/a>]<\/li>\n<li><strong>Restaurant:<\/strong> Identifying a person ready to place an order, correctly listening to the order, and retrieving the object that has been ordered. [<a href=\"https:\/\/www.youtube.com\/watch?v=HJMo0bwGFC8\">Restaurant Video<\/a>] [<a href=\"https:\/\/www.youtube.com\/watch?v=PSTJY2yncWM\">Manipulation Video<\/a>]<\/li>\n<li><strong>Finals:<\/strong> Teams can choose freely what to demonstrate, and are evaluated on criteria such as novelty, scientific contribution, presentation, and performance.<\/li>\n<\/ol>\n<p>During this time, students were spending hours digesting workshop material, testing their code, and slowly building up algorithms to clear the challenges.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-780 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/workshop_tamagawa_nav-1-e1515609547561.jpg\" width=\"910\" height=\"500\" \/><\/p>\n<p style=\"text-align: center;\"><em>Tamagawa Science Club robot during the competition &#8212; Courtesy of Dr. Jeffrey Tan<\/em><\/p>\n<p>Some highlights:<\/p>\n<ul>\n<li>Teams\u00a0experienced firsthand how to take pieces of code that performed different tasks and integrate them into one working system.<\/li>\n<li>All teams had\u00a0complete computer vision and control algorithms with MATLAB, but\u00a0not all teams had functioning speech detection\/synthesis and manipulator control. 
They found the MATLAB code easier to set up and modify than some of the other ROS-based packages, which required knowledge of Python, C++, and\/or the <a href=\"http:\/\/wiki.ros.org\/catkin\">Catkin build system<\/a>.<\/li>\n<li>The\u00a0two high school teams\u00a0successfully generated a map of the environment and implemented a navigation algorithm.<\/li>\n<li>The Nalanda team was able to add obstacle avoidance to their\u00a0robot using\u00a0the <a href=\"https:\/\/www.mathworks.com\/help\/robotics\/ug\/vector-field-histograms.html\">Vector Field Histogram<\/a> functionality in Robotics System Toolbox.<\/li>\n<li>A handful of\u00a0teams were able to tune some\u00a0of the\u00a0pretrained <a href=\"https:\/\/www.mathworks.com\/help\/vision\/ref\/vision.cascadeobjectdetector-system-object.html\">cascade object detector<\/a>\u00a0and <a href=\"https:\/\/www.mathworks.com\/help\/vision\/ref\/vision.peopledetector-system-object.html\">people detector<\/a>\u00a0examples for their person recognition and final challenges.<\/li>\n<\/ul>\n<p><img decoding=\"async\" loading=\"lazy\" width=\"1024\" height=\"477\" class=\"aligncenter size-large wp-image-778\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/workshop_person_recognition-1024x477.jpg\" alt=\"\" \/><\/p>\n<p style=\"text-align: center;\"><em>KameRider EDU during the Person Recognition challenge &#8212; Courtesy of Dr. Jeffrey Tan<\/em><\/p>\n<h1>Conclusion<\/h1>\n<p>This event was a lot of fun, and as a bonus I enjoyed escaping the cold Boston winter for a few weeks. 
It was satisfying to see how much the students were able to achieve, and how our conversations evolved, in such a short time frame.<\/p>\n<ul>\n<li>At the beginning, it was mostly questions about installation, error messages, and basic MATLAB and ROS usage &#8212; plus the occasional &#8220;how am I going to get all of this done?&#8221;.<\/li>\n<li>Near the end, students\u00a0had a\u00a0pretty good\u00a0understanding of basic ROS constructs (topics, messages, launch files, Catkin, etc.), general programming tools\u00a0(conditional statements, loops, functions, breakpoints, etc.) and &#8212; most importantly &#8212; were already asking &#8220;what&#8217;s next?&#8221;<\/li>\n<\/ul>\n<p><img decoding=\"async\" loading=\"lazy\" width=\"1024\" height=\"486\" class=\"aligncenter size-large wp-image-774\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/rcap_workshop_action-1-1024x486.png\" alt=\"\" \/><\/p>\n<p style=\"text-align: center;\"><em>Workshop in action &#8212; Courtesy of Team Nalanda<\/em><\/p>\n<p>The general consensus was that both MATLAB and ROS were needed to make this workshop happen. Implementing and testing algorithms was made accessible by\u00a0MATLAB, whereas installing existing ROS packages provided some of the necessary low-level sensing, actuation, and mapping capabilities.<\/p>\n<ul>\n<li>Many ROS\u00a0packages were easy to set up and could deliver powerful results right away. 
However, understanding the underlying code and build system to modify or extend these packages was nontrivial for beginners.\u00a0This is likely because ROS\u00a0is designed for users\u00a0comfortable with a rigorous software development process.<\/li>\n<li>On the other hand, MATLAB needed a single installation at the beginning, required no\u00a0recompilation, and the sample code (both our workshop files and documentation examples) was found to be easy to follow, debug, and modify.<\/li>\n<\/ul>\n<p>Heramb Modugula (Team Nalanda) remarked that &#8220;plenty of time was available to tinker with the robot and example code, and eventually, coding on our own&#8221;. His coach and father, Srikant Modugula,\u00a0highlighted integration of software components as the most critical task. &#8220;While MATLAB provides a powerful framework to do robotic vision, motion, and arm related tasks, we look forward to an easier way to connect it with ROS enabled TurtleBot and seamlessly compile\/run multiple programs.&#8221;<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-776 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/workshop_vision-2-e1515610202885.jpg\" width=\"944\" height=\"500\" \/><\/p>\n<p style=\"text-align: center;\"><em>Computer vision and manipulation at the workshop &#8212; Courtesy of Dr. Jeffrey Tan<\/em><\/p>\n<p>To summarize, MATLAB and its associated toolboxes are\u00a0a complete\u00a0design tool. 
This includes a programming language, an integrated development environment (IDE), graphical programming environments like\u00a0<a href=\"https:\/\/www.mathworks.com\/products\/simulink.html\">Simulink <\/a>and <a href=\"https:\/\/www.mathworks.com\/products\/stateflow.html\">Stateflow<\/a>, apps to help with algorithm design and tuning, and <a href=\"https:\/\/www.mathworks.com\/solutions\/embedded-code-generation.html\">standalone code generation tools<\/a>.<\/p>\n<p>Our recommended approach\u00a0is to use MATLAB and Simulink for prototyping\u00a0algorithms that may be a subset of the entire system, and\u00a0then\u00a0<a href=\"https:\/\/www.mathworks.com\/help\/robotics\/examples\/generate-a-standalone-ros-node-from-simulink.html\">deploying these algorithms as standalone ROS nodes using automatic code generation<\/a>. This way, the robot\u00a0does not\u00a0rely\u00a0on the MATLAB environment (and its associated overhead) at competition time.\u00a0For more information, please refer to <a href=\"https:\/\/blogs.mathworks.com\/racing-lounge\/2017\/11\/08\/matlab-simulink-ros\/\">our Getting Started with MATLAB, Simulink, and ROS blog post<\/a> or reach out to us.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"aligncenter wp-image-742 size-full\" src=\"https:\/\/blogs.mathworks.com\/racing-lounge\/files\/2018\/01\/competition_shots-e1515438379673.jpg\" width=\"967\" height=\"400\" \/><\/p>\n<p style=\"text-align: center;\"><em>Find the MathWorks logos!<\/em><\/p>\n<p style=\"text-align: left;\">Our hope is that challenges like these will lower the entry barrier for new teams from around the world to join the RoboCup major leagues and perform competitively in their first year.\u00a0This will create opportunities for newcomers to become comfortable with robot programming\u00a0and eventually transition to\u00a0being &#8220;true&#8221; major league teams &#8212; that is, bringing\u00a0state-of-the-art algorithms to RoboCup and pushing the boundaries of robotics 
research worldwide.<\/p>\n<p style=\"text-align: left;\">For this reason, we will work towards offering this workshop at future events, as well as open-sourcing our materials and posting them\u00a0online. If you are interested in using this material to learn or teach, or have any thoughts to share, please leave us a comment. We hope to see more of you sign up for future RoboCup@Home Education challenges. Until next time!<\/p>\n","protected":false},"excerpt":{"rendered":"<div class=\"overview-image\"><img src=\"https:\/\/blogs.mathworks.com\/student-lounge\/files\/2018\/01\/WorkshopPic-e1515007709965.jpg\" class=\"img-responsive attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"\" decoding=\"async\" loading=\"lazy\" \/><\/div>\n<p>The Racing Lounge is back from Christmas break. I wish you all a successful and healthy 2018!<br \/>\nIn today&#8217;s post,\u00a0Sebastian Castro\u00a0discusses his experiences with a robotics workshop he helped&#8230; <a class=\"read-more\" href=\"https:\/\/blogs.mathworks.com\/student-lounge\/2018\/01\/17\/robocupathome-education-workshop\/\">read more 
>><\/a><\/p>\n","protected":false},"author":155,"featured_media":696,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[6,14],"tags":[25,15,24,54],"_links":{"self":[{"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/posts\/680"}],"collection":[{"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/users\/155"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/comments?post=680"}],"version-history":[{"count":53,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/posts\/680\/revisions"}],"predecessor-version":[{"id":2856,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/posts\/680\/revisions\/2856"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/media\/696"}],"wp:attachment":[{"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/media?parent=680"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/categories?post=680"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/student-lounge\/wp-json\/wp\/v2\/tags?post=680"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}