Autonomous Systems

Design, develop, and test autonomous systems with MATLAB

Using MATLAB for Robotics Education


This post is from Peter Corke, a professor, researcher, teacher and writer about robotics and robotic vision. 


The journey

My journey into robotics began in the mid-1980s, and I learnt about homogeneous transformations and kinematics from the books by Paul [1] and Fu et al. [2]. I wrote code to implement those algorithms, initially in C, so that I could control a Puma 560 robot. By the time I started my PhD in 1990 I'd been using MATLAB for a few years and found it to be very productive. I began to reimplement a lot of my C code in MATLAB, starting with primitives for creating transformation matrices, roll-pitch-yaw angle and other conversions, and robot arm forward and inverse kinematics. In 1995 I was very fortunate to attend my first ever International Symposium on Robotics Research (Herrsching, Germany), where I got to meet many of the big names in robotics. I demonstrated my humble toolbox to some folk there; they said nice things and encouraged me to make it publicly available. Anybody still remember FTP servers?

John Craig chose the toolbox for use in the third edition of his well-known textbook [3]. Over time I added more functionality to the toolbox, e.g., quaternions and mobile robotics, and extended the manual with more and more tutorial content. In the early 2000s I started to think about writing a book that would have a big emphasis on code and code examples. In 2009 I had the opportunity to start writing that book, and it was published in 2011.

I began an active collaboration with MathWorks and had some input to the Robotics Toolbox as it was developing. Through that process I got to know Remo Pillat, who leads the Robotics Team. In 2015 I spent a few months at MathWorks working with Remo and the robotics team, and met Witek Jachimczyk, who headed up the Computer Vision Team. I had a personal agenda to get as much functionality from my own toolbox into the MathWorks products, selfishly so that there'd be less personal toolbox maintenance for me to do. It was during this visit that I had the idea of factoring out all the code related to what I started calling "spatial math" into its own toolbox: a set of functions and classes that implement all the mathematical objects we use to represent orientation and pose, in 2D and 3D, and all the conversions between those types. This includes the Lie group matrices SO(2), SE(2), SO(3) and SE(3), quaternions, roll-pitch-yaw angles, angle-vector pairs, exponential coordinates, twists and so on. I quickly refactored my own toolboxes and the Spatial Math Toolbox was born. I gave a lunchtime "brown bag" seminar at MathWorks and rather audaciously proposed that all this functionality should go into the MATLAB core and be used by all toolboxes, rather than have different partial implementations for each toolbox. Now in 2023 much of this functionality is included in MATLAB, some in the core and some included with, and common to, selected toolboxes.
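To give a flavour of what "spatial math" covers, here is a minimal sketch of converting one rotation among several of the representations mentioned above, using conversion functions that ship with the Robotics System Toolbox (assumed installed); the angle values are arbitrary.

```matlab
% A minimal sketch of spatial-math conversions between representations.
% Assumes the Robotics System Toolbox (or another toolbox providing
% these conversion functions) is installed.

rpy = [0.1 0.2 0.3];              % roll-pitch-yaw angles (radians)

% Roll-pitch-yaw angles -> SO(3) rotation matrix
R = eul2rotm(rpy, "XYZ");         % 3x3 orthonormal matrix

% The same rotation as a unit quaternion and an angle-vector pair
q     = rotm2quat(R);             % [w x y z]
axang = rotm2axang(R);            % [axis_x axis_y axis_z angle]

% Compose an SE(3) homogeneous transform from translation and rotation
T = trvec2tform([1 2 3]) * rotm2tform(R);   % 4x4 matrix in SE(3)

% Round-trip back to roll-pitch-yaw angles as a consistency check
rpy2 = rotm2eul(tform2rotm(T), "XYZ");
```

The point of having one shared implementation is exactly this kind of interoperability: every representation round-trips through a common SO(3)/SE(3) core rather than each toolbox defining its own partial conversions.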

Third edition of the book

By 2020 it was time to start thinking about a third edition of the book. My toolbox was nearly 30 years old, and it was clear that the MathWorks toolboxes developed over the last decade could do everything that mine could do, plus more. Dropping my toolbox meant reworking every line of code in the book (there are over 2500 lines of MATLAB code), modifying the narrative to suit, plus getting up to speed with all the relevant toolboxes in the MathWorks product line. Or I could invite my friends Witek and Remo to join as authors. We made a pact and started in 2021, and it has been a thoroughly enjoyable and productive collaboration. The book benefited across the board from having new eyes looking at it, and Witek and Remo were able to bring not only their own perspectives, but also those of their customers. The manuscript was prepared using LaTeX; we communicated using GitHub, Teams and email and had a weekly coordination meeting. The book will be out late in the first quarter of 2023.


Teaching robotics at university

My day job is being a professor at the Queensland University of Technology, and I’ve made significant use of MATLAB for teaching over the last decade.  I use MATLAB Grader to set problems for students, and test their ability to understand problems by how well they articulate solutions as MATLAB code.  Grader provides instant feedback which the students appreciate, and I don’t have to grade their papers which I appreciate.

I believe it is important to provide students with access to physical robots, but this is challenging for large classes. Last year we used MATLAB to share one robot across a class of 180 students: students emailed their MATLAB code to a server connected to the robot. The task was to type a given 5-letter word on a keyboard, and the solution involves coordinate transformations, inverse kinematics and simple path planning. The students did their development using MATLAB and CoppeliaSim via a simple API. The same API was used when their code ran on the server, and the server returned a detailed log file as well as a video of the robot doing the task. The server enforced motion limits to prevent damage to the robot and the keyboard. The whole runtime was written in MATLAB, and the email handling was done with Power Automate.
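The course's own API isn't shown here, but the core computation, mapping each key's Cartesian position to a joint configuration, can be sketched with the Robotics System Toolbox's `inverseKinematics` solver. The robot model, frame name and key coordinates below are illustrative assumptions, not the actual course setup.

```matlab
% Hypothetical sketch: solve inverse kinematics for a sequence of key
% positions using the Robotics System Toolbox.  The robot model, frame
% name and key coordinates are illustrative, not the course's setup.

robot = loadrobot("universalUR5", DataFormat="row");
ik = inverseKinematics(RigidBodyTree=robot);
weights = [0.25 0.25 0.25 1 1 1];   % weight position above orientation
qPrev = robot.homeConfiguration;

% Assumed key positions (metres) in the robot's base frame
keyPositions = [0.40  0.10 0.05;
                0.42  0.05 0.05;
                0.38  0.00 0.05;
                0.42  0.05 0.05;
                0.40 -0.05 0.05];

for i = 1:size(keyPositions, 1)
    % Desired pose: tool above the key, pointing straight down
    T = trvec2tform(keyPositions(i,:)) * eul2tform([0 pi 0], "XYZ");
    [qSol, info] = ik("tool0", T, weights, qPrev);
    qPrev = qSol;   % seed the next solve for a smooth joint-space path
end
```

Seeding each solve with the previous solution keeps consecutive configurations close together, which is the "simple path planning" part of the task; a real submission would also interpolate between the solved configurations before sending them to the robot.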

The robot ran unattended in a locked lab but the students could see it through the glass wall, and a screen showed the identity of the group whose job was currently running. The students found this very engaging and the robot was available 24×7 to run their jobs.  It was a delight to see students watching their robot through the glass and high-fiving after their submitted job ran successfully.

In our mobile robot course we use small custom-built [4] mobile robots that include a Raspberry Pi and camera. An onboard web server allows a user to send motion commands, read odometry or grab an image from the robot's camera. A RESTful web API allows the robot to be controlled from a student's laptop, and we provided a MATLAB API. The robots have IR LEDs on top, and an overhead Raspberry Pi and camera system can provide localization information. We use this setup to teach mobile robot motion control, moving to a point or following a line, path planning, obstacle avoidance, and visual SLAM.
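A RESTful API like this maps naturally onto MATLAB's built-in HTTP functions, which is essentially what a MATLAB wrapper for such a robot does. The sketch below uses `webread`; the host name, endpoint paths and parameter names are hypothetical, and the actual PenguinPi API may differ.

```matlab
% Hypothetical sketch of a MATLAB client for a mobile robot exposing a
% RESTful web API.  The host name, endpoint paths and parameters are
% illustrative; the actual PenguinPi API may differ.

robotURL = "http://penguinpi.local:8080";

% Send a motion command: set the left and right wheel speeds
webread(robotURL + "/robot/set/velocity", "value", "20,20");

% Read odometry back (e.g. wheel encoder counts) as a structure
odom = webread(robotURL + "/robot/get/encoder");

% Grab an image from the onboard camera and display it
img = webread(robotURL + "/camera/get");
imshow(img);
```

Because everything goes over HTTP, the same three operations (command, odometry, image) work identically from MATLAB, a browser, or any other language, which is what makes a thin MATLAB API over the web server practical for a whole class.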


[1] R. P. Paul. Robot Manipulators: Mathematics, Programming, and Control. MIT Press, Cambridge, Massachusetts, 1981.

[2] K. S. Fu, R. C. Gonzalez, and C. S. G. Lee. Robotics: Control, Sensing, Vision and Intelligence. McGraw-Hill, 1987.

[3] J. Craig. Introduction to Robotics: Mechanics and Control, 3rd ed. Prentice Hall, 2004.

[4] https://github.com/qcr/PenguinPi-robot
