Damien works with students and teachers from around the world, bringing the effective use of technology to the classroom.
LEGO MINDSTORMS Damien is a member of the MCP (Mindstorms Community Program), a small group of experts who collaborate with LEGO to make the MINDSTORMS product better.
VEX IQ Robotics Damien is a member of the VEX IQ Super User group, a small group of experts who collaborate with VEX to make the VEX IQ platform a better product.
So a question came through the EV3 Facebook group from Ana:
"Hello! Has anyone tested (or is there any documentation about) the line width that the EV3 light / color sensor is capable of reading? "
Having just finished up a project and looking for something small to help me procrastinate about starting my next big task, I thought "This is an awesome way to waste time / learn something new". And while knowing the answer is very helpful, I'm also really keen to learn from the process of getting the answer and figured the datalogging capability of the EV3 would be perfect.
It's not a difficult thing to do, and something that you could do in your classroom / robotics club. Firstly I put together a small document that has a series of lines on it. It starts at 1mm, and goes all the way up to 20mm in width. I made sure that there was a 15mm gap between each line. This was all done with Microsoft Word, and if you'd like my version, you can download it here. linetest.pdf (check your scaling if you print this out)
I then built an extremely simple robot based on my quick build, but with the Colour Sensor on a boom mounted over the side. I didn't want the robot driving over the lines and potentially disturbing them. The boom arm is also adjustable, so I can raise and lower the height of the Colour Sensor.
I put together a very simple datalogging program as follows:
1. Start the robot driving (15% power)
2. Start datalogging (1000 samples per second, for 5 seconds)
3. After datalogging has finished, stop the robot
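I built the program in EV3-G, but if your class runs Python on the brick instead, a rough equivalent might look like the sketch below (a minimal sketch assuming the ev3dev2 library; the port letters are assumptions, and Python won't get anywhere near 1000 samples per second, so it just logs as fast as it can):

```python
#!/usr/bin/env python3
# Sketch of the line-test experiment, assuming the ev3dev2 library.
# Motor ports B+C and sensor port 3 are assumptions -- match your build.
import time
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C, SpeedPercent
from ev3dev2.sensor import INPUT_3
from ev3dev2.sensor.lego import ColorSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
cs = ColorSensor(INPUT_3)

samples = []
tank.on(SpeedPercent(15), SpeedPercent(15))    # start driving at 15% power

start = time.time()
while time.time() - start < 5:                 # log for 5 seconds
    # EV3-G managed 1000 samples/sec; Python on the brick is slower,
    # so we simply sample as fast as the loop allows.
    samples.append((time.time() - start, cs.reflected_light_intensity))

tank.off()                                     # stop the robot

with open("linetest.csv", "w") as f:           # dump the data for graphing
    for t, reading in samples:
        f.write("{:.4f},{}\n".format(t, reading))
```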
Set up the robot over the paper, and run the experiment. I then moved the Colour Sensor up one Technic hole on the boom arm and ran the experiment again. I did this for a total of 4 different sensor heights.
Looking at the data shows some very interesting results.
Lowest Level: You can quite clearly see the 13 different lines, represented by the 13 dips in the graph. Down to the 4mm line, the robot clearly sees each individual line (the Colour Sensor reading going all the way down). The 1 / 2 / 3mm lines all show a 'dip', but it doesn't get as low as for the wider lines. You get some weirdness in the gaps between these smallest lines, but I think I may have had slightly wavy paper that accentuated the readings at such a close distance.
1 Technic Hole up: You can still quite clearly see the 13 different lines; however, by the time you get to the 5mm line, the dip is only 50% of the original reading, and subsequent lines show an even smaller drop. This is to be expected: with the Colour Sensor higher up, the red light spreads out more by the time it reaches the ground, so the lines now form a smaller percentage of the overall area 'seen' by the red light.
2 Technic Holes up: I would say that the robot could 'see' up to the 5mm line. It's iffy though: while our human eyes can discern the pattern in the graph, at 5mm the robot is looking for a drop of only 3 percentage points.
3 Technic Holes up: Again, as we would expect, with the sensor so high up it's hard to pick out the lines at all. At a stretch I'd say it could see the 20mm, 15mm, 12.5mm and 10mm lines, but you'd be optimistic to claim any others.
Conclusion: So the question was: what line width is the EV3 capable of detecting? The answer here is (as with almost all robotics) "It depends".
If you're going slowly enough, and you know what values to expect from the light and dark areas of your mat, you could conceivably tune your program to have your robot stop on a 1mm thick line. However, this would have to be very finely tuned, and wouldn't take into account all the other weird and wonderful colours and surfaces you may drive along on the way to that 1mm thick line.
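To make "finely tuned" concrete, here's a minimal stop-on-a-line sketch, again assuming ev3dev2 Python; the threshold value is made up, and you'd pull the real one from your own logged data:

```python
# Minimal stop-on-a-line sketch (ev3dev2 Python, hypothetical values).
# LINE_THRESHOLD is an assumption: tune it from your own datalogging so
# it sits just below the 'dip' your line produces on your mat.
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C, SpeedPercent
from ev3dev2.sensor.lego import ColorSensor

LINE_THRESHOLD = 45                  # percent reflected light (made up)
tank = MoveTank(OUTPUT_B, OUTPUT_C)
cs = ColorSensor()

tank.on(SpeedPercent(5), SpeedPercent(5))   # go slow: a 1mm line passes quickly
while cs.reflected_light_intensity > LINE_THRESHOLD:
    pass                                    # poll until we see the dip
tank.off()
```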
This took me a total of about 2 hours to think about, build, program and analyse. It is definitely within the reach of your typical students to run their own version of the experiment and come up with some results.
Possible modifications your students could run:
- Keep the line widths the same, but vary the greyscale percentage of the line, i.e. 100% black, 80% black, 60% black, etc.
- Keep the height of the Colour Sensor the same, but vary the speed of the robot
If you do run this with your kids, let me know, I'd love to see your data!
PS. Here is all the data on one graph. The lines don't line up because, as much as I tried, I couldn't get the *exact* same starting position each time. If I did it again, I might set up a starting "trigger" to make sure they all line up together.
At the VEX World Championships in 2016, I was really lucky to have the opportunity to drive around Steve Hassenplug's full-sized VEX R2D2. As I was doing it, I was amazed at how everyone knew R2 and wanted to interact with it. I've always loved R2 from the Star Wars movies, and in the back of my mind I've always wanted to build my own version. Fast forward a few months, and I had a bit of downtime with work, so I built my first version. Unfortunately I didn't have a wide selection of colours, and so my first version came out predominantly grey.
It got a lot of positive comments from plenty of people, so I figured I'd do a version 2, in the right colours :)
Version 2 was really well received and I got so many requests for building instructions! I drove him nearly non-stop throughout the entire 2017 VEX World Championships, and it was so much fun to watch everyone interact with such a great little droid. I had a request for R2 to be displayed at an interstate conference, so I packed him up in plenty of bubblewrap and waved goodbye as the couriers collected the box. Unfortunately disaster struck, and R2 never made it to the conference, disappearing somewhere en route :( A few weeks of frantic calls from the organisers and still no luck in locating him :(
So on to version 3, and this time I decided to design it in LDCad first, based on photos and video I had of Version 2. I was travelling for work, so I holed up in a hotel for a few nights in a row and managed to get 80% of the CAD model done in that time. The awesome Andreas Dreier (vexiq.dreier-privat.de/projects/r2d2/r2d2.html) and Steve Hassenplug (teamhassenplug.org/VEX/R2/) then took the CAD model, along with my crude step-by-step breakdown, and turned it into a series of Building Instructions that anyone can follow. Having a few extra sets of eyes go over my original design was fantastic, with both Steve and Andreas often suggesting changes here and there to make the instructions easier to follow and the model better in the end.
Rogues' Gallery
Have you built your own VEX IQ Astromech Droid? Please send me a photo, I'd love to see what you've come up with and add your version to our little gallery.
Version 1
Version 2
The awesome Team at IFI (VEX Headquarters) made their own version
If you'd like to have a go at building your own version, you can download the instructions here. Be warned, it's not the easiest build and you'll need a standard VEX IQ kit as well as lots of spare parts. If you're missing a particular part, that is an awesome opportunity to flex your creative/engineering mind and come up with a work-around. If you do come up with something clever, please let me know about it!
Every year we have a LEGO fan expo and I love to make creations that are interactive. All too often kids are not allowed to touch anything at these expos and so they love it when they have a chance to touch something.
I first showed off this robot in 2016, but in 2017 I made some changes to make it a little more reliable. Quick video below on how it works and if you keep scrolling, I'll outline how the software works.
Software
The whole program was written in the EV3-G software, and is a nice example of how to use the Array functionality.
The software works as follows:
Wait for the Green button to be pressed (this is the start button) and initialise an array that is 8 elements long
Choose a random number between 1 and 3 for each of the 8 positions in the array. Each random number corresponds to a particular chime to be played.
I wasn't happy with the default Random Number Generator as it was giving me way too many of the upper and lower bound values (1 and 3 in my case). So I wrote my own. Basically it chooses a random number between 0 and 4 and if it is not inside the range 1-3, it re-chooses the random number.
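For anyone doing this in text code rather than EV3-G, the re-choosing trick is just rejection sampling; here's a minimal Python sketch of the same idea:

```python
import random

def chime_number():
    """Pick 1, 2 or 3, re-rolling anything outside that range.

    Mirrors the EV3-G workaround described above: draw from the wider
    range 0-4 and reject the out-of-range values, rather than trusting
    the built-in block that over-represented the bounds.
    """
    while True:
        n = random.randint(0, 4)   # inclusive on both ends
        if 1 <= n <= 3:
            return n
```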
Play the sequence according to what round the player is up to, i.e. if the player is at level 1, only play the first chime of the sequence; if the player is at level 6, play the first 6 chimes in the sequence
Once the sequence has finished, wait for the player to press the corresponding buttons. Each button is assigned a number that matches the chime number. By checking if the button number matches the chime number, the robot can determine if the correct button was pressed.
If the player gets the sequence right, move to the next level, otherwise sound an Error alarm and reset back to the Start.
If the player gets through 8 rounds, then play all the chimes at once along with some cheering sounds. This is in a Switch to make sure that if the player doesn't make it to the end (level 9), they don't get the cheering.
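Pulling those steps together, here's a hedged Python sketch of the same game logic. play_chime(), wait_for_button(), sound_error_alarm() and play_victory_fanfare() are hypothetical stand-ins for the EV3-G sound and button-wait blocks, and chime_number() is the rejection-sampling function from earlier:

```python
# Hedged sketch of the game logic; the four helper functions are
# hypothetical stand-ins for the EV3-G sound and button-wait blocks.
sequence = [chime_number() for _ in range(8)]   # fill the 8-long array

level = 1
while level <= 8:
    for chime in sequence[:level]:              # play the first 'level' chimes
        play_chime(chime)
    for chime in sequence[:level]:              # now check the player's presses
        if wait_for_button() != chime:          # button numbers match chime numbers
            sound_error_alarm()
            level = 1                           # wrong press: reset to the start
            break
    else:
        level += 1                              # all correct: next level

if level > 8:                                   # made it through all 8 rounds
    play_victory_fanfare()                      # all the chimes plus cheering
```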
So there it is! I have no doubt there are probably better / more efficient ways of doing it, but this works well. Let me know what you think :)
In this tutorial, I'll show you how to use the LEGO MINDSTORMS EV3 datalogging capabilities to measure how fast a fidget spinner spins.
I use a Colour Sensor in Reflected Light mode, positioned above one of the lobes of the spinner. When the sensor is pointed directly at the table, it gives a high reflected light reading; however, when one of the lobes of the spinner passes under the sensor, the reading changes. By measuring the time between each lobe passing under the sensor (and taking into account that there are three lobes), the rpm of the spinner can be calculated.
I set the Experiment up to run for 10 seconds and to take 1000 samples per second. The spinner is moving pretty quickly, so we want to grab as much data as we can.
After we run the experiment, we upload the data and get the following:
If we zoom in on the data, we can clearly see the lobes of the spinner as they pass under the Colour Sensor.
The first part is before we get the spinner spinning. By zooming in on some data, we can clearly see how the spinner reflects different amounts of light depending on what is directly under the Colour Sensor. I was surprised at how clearly you can see the various parts of the spinner. You can clearly see the silver parts of the bearing nestled in between the black moulded plastic.
By measuring the time between any two common locations on the graph, you can calculate how fast one lobe of the spinner goes past. Keeping in mind that there are 3 lobes on my spinner, you can calculate the rpm of the spinner at different times.
By using the analysis tool, we can measure how long it takes the spinner to do one spin. You'll note I've taken the first 'high' part of the graph, where the sensor sees the white table, and then again the 4th 'high' part. This is because there are 3 lobes on the spinner.
You can see that the start of my section crosses the x-axis at 2.16 seconds, and ends at 2.30 seconds. This gives a time of 0.14 seconds per revolution.
To convert into revolutions per minute (rpm), we divide 60 by the time for 1 revolution: 60 / 0.14 ≈ 428 rpm.
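If you export the logged data, the same calculation is easy to script. Here's a sketch of one way to do it in Python; the 50% threshold and the three-lobe assumption match my setup and would need adjusting for yours:

```python
def rpm_from_log(samples, threshold=50, lobes=3):
    """Estimate rpm from a list of (time, reflected_light) pairs.

    Finds the times where the reading rises back above 'threshold'
    (one rising edge per lobe), then takes the gap between edges that
    are one full revolution apart: rpm = 60 / time_per_revolution.
    The threshold of 50% is an assumption -- pick a value between your
    'table' and 'lobe' readings.
    """
    edges = []
    for (t0, r0), (t1, r1) in zip(samples, samples[1:]):
        if r0 <= threshold < r1:        # rising edge: a lobe just passed
            edges.append(t1)
    if len(edges) <= lobes:
        return None                     # not enough edges for a revolution
    rev_time = edges[lobes] - edges[0]  # e.g. 2.30 - 2.16 = 0.14 s
    return 60 / rev_time                # 60 / 0.14 gives ~428 rpm
```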
I calculated the speed at the start and again around 8 seconds after it had started spinning. My spinner went from 428 rpm to 215 rpm during this time.
This would be an awesome experiment in class as it would be possible to see the relative slowdown between different types of spinners.
I was recently invited by CQUni to give a guest lecture. I spoke for about 45 minutes on the topic "Don't Teach Robotics, use Robotics to Teach". The crux of the talk was that in education, we generally want to teach problem solving, computational thinking etc. The robots are just a really good platform with which to teach those concepts.
I talk quite a bit about the Australian Technologies Curriculum as well as how robotics can be used to meet that curriculum. We had a few audio issues at the start, but they are fixed by the 4 minute mark. The talk was live streamed out to half a dozen or more CQUni campuses around Australia.
It's been hot here in Australia. Maximums have been around 30C-35C all week, and the humidity has been up over 50%. We get the occasional news story about the dangers of leaving kids and pets in cars on days like these, and it is easy to understand why.
It's been on my to-do list for a number of years to actually log the temperature rise inside a car, and today I got a few hours to set up an experiment and find out (an hour of which was spent just trying to find my temperature sensor!)
So here is the setup. An EV3 Brick and a NXT Temperature probe (Your LEGO Education supplier should have them in stock).
I started up a new experiment in EV3-G and set the Experiment Units Setup to have a Duration of 20 minutes and a Rate of 5 Seconds between Samples.
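For anyone who'd rather script it, a rough Python equivalent of that experiment setup is sketched below; read_temperature() is a hypothetical stand-in for however your particular setup reads the NXT probe:

```python
import time

def read_temperature():
    """Hypothetical helper: replace with however your setup reads the
    NXT temperature probe (the EV3-G Experiment handles this for you)."""
    raise NotImplementedError

DURATION = 20 * 60    # 20 minutes, in seconds
INTERVAL = 5          # one sample every 5 seconds

with open("car_temp.csv", "w") as log:
    start = time.time()
    while time.time() - start < DURATION:
        log.write("{:.0f},{:.1f}\n".format(time.time() - start,
                                           read_temperature()))
        log.flush()   # flush each sample so nothing is lost mid-run
        time.sleep(INTERVAL)
```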
I started the experiment with the EV3 on my outside table in the shade for 2 minutes to get a baseline reading of the air temperature.
After 2 minutes I drove my car out of the garage and parked it in full sun on the driveway. I put the EV3 brick and sensor on the passenger seat, making sure they were still in the shade inside the car.
20 minutes later (well, 18 if you count the 2 minutes resting on the table) I went to retrieve the EV3 and uploaded the data. You can clearly see the 2 minutes resting before it goes into the car, at around 32C - 33C (around 90F). It then rockets up to over 44C (112F) in the space of just over 15 minutes.
It is a great visual reminder of just how hot it can get in there, in a very short period of time!
I was asked a few weeks ago if I'd like a review copy of these cards and I thought, 'why not?' and then promptly forgot about them. Fast forward to last week and they rock up on my doorstep as promised :)
I was a huge fan of the old SCRATCH Cards, designed for version 1.4. They were freely available for download and I even went to the trouble of laminating several sets for use with my workshops. They were well put together with simple, small activities that kids could work through at their own pace and with 12 different cards, each highlighting a different aspect of SCRATCH, it was a great way to get kids up and running.
Things have been greatly improved since, with the release of the box set of 75 (Yep, 75!) cards for use with SCRATCH 2.0 from No Starch Press. These cards have been put together by Natalie Rusk, one of the lead developers of SCRATCH at MIT.
My first impression: I'm really impressed. It is great to see them laminated (it will make them last longer in a classroom). I also really like the way they have been grouped into projects based around a theme. Each theme has a header card that briefly describes the project, and then lists the other cards in the series that you need to complete to finish the project. I'm a big fan of this method of instruction: guiding the kids into breaking down a large problem into several smaller problems (a central tenet of Computational Thinking) and then letting the students have lots of small 'wins' as they progress through the cards.
Some of the projects are structured such that you can complete the cards in any order, others need to be completed in a specific order.
Each card has two sides (obviously). One side gives the 'Aim' of the card (what the student hopes to achieve by completing it); the other side gives some basic instruction on how to assemble the code blocks. It will point you in the right direction for finding sprites or changing backgrounds, but a rudimentary knowledge of SCRATCH certainly helps to fill in the gaps.
The only thing I would have liked to see more of is prompts to get kids to make changes to their code and explore the effect on their program. There are a few 'tips' scattered amongst the cards, but a deliberate structure would have been nice, i.e. "What would happen if you changed xxx to zzz? Try it and see what happens. Now make your own change and see if it does what you expect."
I was recently sent the newest humanoid robot offering from Meccano, the Meccanoid XL 2.0. It is apparently an upgrade from last year's G15KS. Full disclosure: this was sent to me for free, with no expectation that I send it back.
First up, it is a great build, and my kids were drawn to it immediately. It is big, standing 4' (1.2m) tall, which makes it quite a sight to see. The voice interactions are great and there is an impressive number of commands you can use. Just a quick caveat: out of the box, my Meccanoid had no response other than "I'd love to do that, right after you update my firmware". The firmware update was quick and painless.
Building:
Mechanically, the build was great fun for me. I love these kinds of systems and quite happily spent 5 hours putting this guy together. While the box says suitable for ages 10+, I think a 10 year old would really struggle and get quite frustrated at times. I like to think I'm quite good at these, and yet it still took me 5 hours. I think this is more suited to a parent/child combination working a few hours over several days.
When you open the box, there is literally a kilogram of nuts and bolts, so you know that you're in for a long build. The supplied hex-head screwdriver is great, but I found the accompanying nut-holder tool pretty much useless. The times I needed something to hold a nut in a confined space, the tool let me down badly.
Meccano have copped flak for having all these parts in plastic rather than metal, but that didn't really faze me. One thing this has allowed is for the plastic parts to be moulded with nut housings and raceways. What this means is that you can place a nut in the housing, and once the bolt engages, it will hold itself in place, no tool necessary. There are a few parts where a 'raceway' has been moulded in, allowing you to slide the nut into the correct place rather than having to delicately hold it there. A lot of the angle connectors have moulded lugs in them, which allowed for easy registration with their mating faces.
If you enjoy tinkering with mechanical stuff, you'll love putting this together.
The major downside was the instructions. I spend a *lot* of time teaching kids, and clear instructions are paramount. Meccano's instructions were sadly lacking in many places. First up, they were too dark in places, which made counting holes a little difficult (was it the 3rd or 4th hole from the end?). There were a few sequences of instructions where I wondered to myself, "Why did they do it in that order?"; with a little more thought put into them, they could have made the model a bit easier to assemble. When assembling the parts that have motor wires, they neglect to show you where to put the cables. On a few parts I inadvertently pinched the cables. Luckily I hadn't tightened the parts enough to cut through the cables, and was able to reroute each cable along what I thought was the correct path.
Programming:
Meccano claim that there are 3 ways of programming the G15KS; there is no information on their website about the XL 2.0, however. The L.I.M. (Learned Intelligent Motion) programming is a little better. In this mode you can manually move the robot's arms, head and feet around, and the robot will 'remember' the actions and play them back to you. Again, this is a lot of fun, but it doesn't really introduce kids to the concepts of Computational Thinking or Programming.
The G15KS had a motion capture mode where you could put your smartphone in the chest of the robot and, by utilising the smartphone camera, get the robot to do the same actions as you (think exercise instructor out the front, with the robot mimicking your actions). This appears to have been removed for the XL 2.0, but I can't find any info to confirm this.
Voice Activated commands are a lot of fun and a great way to play with the robot, but I wouldn't exactly call it 'programming'. After a protracted attempt to load the Firmware Update software on my computer (requiring Microsoft .NET 3.5 SP1), I could upload the latest firmware. After a quick systems check by the robot (checking that it was moving the correct arms), we could jump into Voice Activated commands. I'm not sure if it was the Aussie accent, but both my 6-year-old son and I struggled to get this to work nicely. Sometimes it would recognise a command, sometimes it wouldn't. Overall though, he enjoyed seeing the robot react to his voice far more than I did.
The last way is via the App. With the app, you are given a virtual robot on your smartphone/tablet that you can manipulate by dragging the arms/head/feet around. Similar to the L.I.M. system, it'll remember this and play it back when you need it. Something I did stumble across (though not through any info on the official Meccano website) is a new feature called Behaviour Builder. This has a lot more appeal for me, as it implies you can string together different behaviours and have them triggered on different inputs etc. This would be a great way to start teaching kids about the fundamentals of programming, namely Sequence, Selection and Iteration.
Rag Doll avatar (you can click and drag his limbs around) and Behaviour Builder
The Help screenshots are great for the basics. They tell you about the inputs and the outputs and what you can control. It seems a little incomplete however, as I found icons with no explanation about what they did.
Help Screens
The side palette toggles between inputs and outputs. On the inputs side, you can trigger the next action based on a few different things - Timer, Counter (To use as a FOR loop), Meccanoid Brain buttons or the Servo angle. This last one was the most fun, as you can trigger actions by lifting the arms of the robot or something similar.
On the output side, you can control the feet motors and any of the servo motors, make sounds, and change the lights in the Eyes/Servos/Brain to just about any colour you want. You can also set in motion an animation sequence that you may have already created with the Rag Doll editor. One thing that would have been nice is access to the pre-recorded movements that are available in Voice Command mode, i.e. Kung-Fu, Dancing or Shake Hands. Unfortunately, if you do want to do something similar, you have to plan out your own movements from scratch.
I was able to do some basic programs whereby simple inputs trigger simple actions. When trying to do something a little more complex, using counters or timers, I couldn't seem to get them to work.
A bit more Googling and I found these two awesome videos from Meccano. They were great at describing some of the more complex parts of the software. Hopefully they put out more of these, as they were well done and fantastic at showing off the software's capability.
This made things far more interesting, and would increase its relevance in a classroom situation. It is a little fiddly, but I eventually got the hang of creating an IF statement with the combined component. The Counter icon is a nice way to implement a FOR loop, but with no documentation on this function yet, I can't see kids figuring it out for themselves.
Creating an IF statement
Having spent so much time in a variety of other graphical programming environments, I found a few things a little clunky. I can foresee kids getting frustrated trying to cut connections and re-position icons. Sometimes when the icons are dragged, the connecting wires stay logically connected, but they often redraw in weird directions, sometimes going behind other icons.
The icons you need to drag out are tiny, and don't always drag smoothly to where you drop them. I gave up on my phone after a while and switched to a tablet just to save my sanity. In all fairness, this is likely how a classroom would run the programming, given the prevalence of tablets in class.
Conclusions:
This was a fun build for me, and while the software seemed very basic at first, I am beginning to see the potential for some more complex programming down the track. I have, however, had a lot of experience with these types of platforms, and I feel a 'regular' kid might at best be able to string together a few instructions and perhaps an occasional branched program.
Does this fit in a classroom? I don't think so. The build is way too long and fiddly to have kids working on it. The voice activated commands would get lost in the ambient noise of a classroom. The programming, while showing promise, just isn't quite ready for a novice classroom teacher to present to a class beyond the absolute basics. In addition, Meccano haven't released any Lesson Plans or Curriculum links, so teachers would need to develop those themselves.
Is it an impressive robot for home? Definitely. It has a lot of great 'play' opportunities (although it struggles with our Aussie accents) and if there is a parent in the household that might be slightly techy / programming inclined, I can see the educational possibilities beyond just the 'play'.
Did I miss anything? Do you have something more to add that I've missed? Please let me know in the comments below.
This is a bit of a long spiel, on a topic that I've been mulling over for the last few years.
I’ve been teaching with robots for over 15 years now, and one of the most common questions I get from teachers is “Which robot platform should I get for my school?” I’ve used over a dozen different platforms quite extensively and, at the end of the day, I truly believe that it doesn’t really matter :)
In the education realm, we should never be solely focused on ‘Teaching Robotics’; instead we should be using ‘Robots to Teach’. Just like any other educational tool, robotics platforms are just a means to teach different concepts. Technology comes and goes, and in this day and age it seems to be coming and going faster every day. Teaching students a very specific tool (such as a single specific robot platform) is fine for the moment, but unless the students understand the broader concepts behind the technology, once something newer and shinier comes along, they will be right back at the beginning, learning a new technology.
By using the platform to teach (rather than teaching the platform), we instil in our kids the ability to solve higher order problems, think more broadly and be more adaptable with the tools they have on hand. When a new technology comes along, they are more likely to understand the tool more rapidly and begin using the tool to help solve their challenges.
We use these platforms to teach programming, computational thinking, problem decomposition, mechanical engineering, branching statements, directional terminology and so on. The robot itself is just a platform used to teach these concepts, so it doesn’t really matter which one you choose. There are a variety of factors that will guide teachers in choosing the platform that suits their school best, and they should include:
Price: If there is a robot platform that is amazing, but it costs $5000 per robot, is that a better investment than an adequate platform that is $200 per robot? For the same amount of money, a cheaper robot can engage more students.
Availability: Can you easily get them in / into Australia (or whichever country you are in)? Are spare parts or add-ons easy to source?
Age appropriate Programming Language: Graphical or Text based? Do you need a platform that can span across both to appeal to a wide range of ages?
Curriculum Resources: Are education-based activities easy to come by? While it would be awesome to have the time to use robots in class just because they are fun, in reality everything we do needs to meet some part of our Curriculum. Are those activities affordable / adaptable / assessable?
Teacher support: Often the ‘robotics’ teacher/s at a school might be only one or two teachers, which makes it a little more difficult to bounce ideas around. Many robotics platforms have good extended Educator communities in the form of mailing lists, forums etc.
Professional Development opportunities: Are your staff comfortable using the equipment in class? Too often I’ve seen cupboards of equipment sitting idle in a classroom because the teacher who originally used it has moved on and no-one else at the school knows how to use the gear. Is the equipment easy to use, and is it just missing a teacher willing to take it on?
Reliability: If you are spending too much time just getting the platform up and running, that is time that could have been spent solving challenges.
At the end of the day, I think the best robotics platform is the one the teacher feels most comfortable using. If they are comfortable with it, then they will teach with it, just like any other tool at their disposal.
I'd really appreciate any thoughts / comments / rebuttals you may have in relation to this. I'd prefer to keep all the conversation centrally located on my facebook page (www.facebook.com/domabotics) but feel free to reply on whatever platform you have read this on.