Controller upgrade

image

I’ve decided to upgrade the remote controller for the robot. Adafruit.io has the above component that you can add to your dashboard. This provides many more commands than the original

image

one that I have been using, as can be seen above.

My plan now is to add a pan and tilt camera to the robot so it can ‘see’. I’m also working on creating some additional parts for the robot to hold the 6V battery case, as well as making the front platform more accessible. I’ll basically place the platform above the 6V battery, which will sit over the front wheels.

I am also working on a way to better secure the breadboard onto the robot, rather than using a bulldog clip. It is all getting rather crowded up there, so creating some more space will be good.

It seems that the camera interfaces to the ESP32-S2 WROOM using a set of SPI connections, which are:

image

which I found here:

https://docs.arducam.com/Arduino-SPI-camera/MEGA-SPI/MEGA-Quick-Start-Guide/

Most of these I can see on the board:

image

The one that is missing is CS = 7. I found this after some hunting:

pin 7 on the ESP32-S2 Thing Plus WROOM is the IO4 pin

I am not sure whether it is true, but I’ll try:

image

GPIO4, on the other side of the chip as shown above, as the pin for CS.

I bought this camera:

https://core-electronics.com.au/arducam-mega-3mp-camera.html

AC-B0400-5

which has pinouts:

image

Once I get it all connected, I need to write the code to capture images. There are lots of examples of doing that with an app on your desktop, but I want the camera to capture an image and send it up to Adafruit.io, which it seems I can do. Not sure exactly how just yet, but the first step is getting the camera hooked up and being able to view the images it captures.

Robot with distance

As I detailed in a previous post:

Mecanum motion

I had my robot moving and taking commands from an Internet based dashboard. The missing piece was to get the distance sensor operational, which I have now achieved.

I first needed to print a mount that allows the distance sensor to be attached to the buggy frame. I have uploaded that to the CIAOPSLabs repo here:

https://github.com/directorcia/ciaopslabs/blob/main/3dprint/mounts/VL53L0X-distance.stl

image

With the VL53L0X sensor now mounted and connected to the processor the layout now looked like the above image.

Basically, the motor controller, distance sensor, and LCD display all communicate with the ESP32-S2 processor via the I2C (SDA/SCL) bus. They achieve this by each being on a different address.
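To illustrate, every device sharing the SDA/SCL pair answers to its own 7-bit address. In the sketch below, 0x29 is the VL53L0X default and 0x27 a common 1602 backpack default, while the motor controller address (0x34) is purely a placeholder assumption; the helper simply sanity-checks that nothing on the bus collides:

```cpp
#include <set>
#include <vector>

// 7-bit I2C addresses of everything sharing the SDA/SCL pair.
// 0x29 is the VL53L0X default and 0x27 a common 1602 backpack default;
// the motor controller address (0x34) is a placeholder assumption.
bool addressesUnique(const std::vector<int>& addrs) {
    // A duplicate address would make two devices answer the same transaction.
    std::set<int> seen(addrs.begin(), addrs.end());
    return seen.size() == addrs.size();
}
```

For example, `addressesUnique({0x34, 0x29, 0x27})` returning true confirms the three devices can coexist on the one bus.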

It was also important to ensure that I connected up the wheels in a known sequence, because to drive the mecanum motion I needed to turn different wheels to make it move in certain directions.

I’ve uploaded the initial code with it all working here:

https://github.com/directorcia/ciaopslabs/blob/main/Projects/14/moveanddistance-v1.c

The commands on the keyboard are:

image

1 = Left forward 45

2 = Forward

3 = Right forward 45

4 = Left 90

5 = Stop/Start

6 = Right 90

7 = Left back 45

8 = Back

9 = Right back 45

* = Slower

0 = Spin on the spot

# = Faster
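For reference, the direction keys above correspond to per-wheel drive signs on a mecanum platform. The mapping below is one common convention, not necessarily the one in my code, and the actual signs depend on how the motors are wired (which is why connecting the wheels in a known sequence matters). Keys 5, * and # change speed rather than direction, so they are left out:

```cpp
// Per-wheel drive sign: +1 forward, -1 reverse, 0 stopped.
// Order is {front-left, front-right, rear-left, rear-right}.
struct WheelCmd { int fl, fr, rl, rr; };

// One common mecanum mapping; the real signs depend on motor wiring.
WheelCmd commandFor(char key) {
    switch (key) {
        case '2': return { 1,  1,  1,  1};  // forward
        case '8': return {-1, -1, -1, -1};  // back
        case '1': return { 0,  1,  1,  0};  // left forward 45
        case '3': return { 1,  0,  0,  1};  // right forward 45
        case '7': return {-1,  0,  0, -1};  // left back 45
        case '9': return { 0, -1, -1,  0};  // right back 45
        case '4': return {-1,  1, -1,  1};  // left 90 (rotate anticlockwise)
        case '6': return { 1, -1,  1, -1};  // right 90 (rotate clockwise)
        case '0': return { 1, -1,  1, -1};  // spin on the spot
        default:  return { 0,  0,  0,  0};  // stop / unrecognised key
    }
}
```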

As the robot moves it displays the distance on the LCD display like so:

image
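One small wrinkle with a 1602 display is that a shorter reading (say 98 mm following 1020 mm) leaves stale digits on screen unless the line is padded out to the full 16 columns. A sketch of that formatting (the “Dist:” label here is my own choice, not necessarily what the robot shows):

```cpp
#include <string>

// Format a distance reading padded (or truncated) to the 1602's 16-column
// width, so shorter values fully overwrite the previous, longer ones.
std::string lcdLine(int mm) {
    std::string s = "Dist: " + std::to_string(mm) + " mm";
    s.resize(16, ' ');  // pad with spaces to exactly one full display row
    return s;
}
```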

The robot starts with speed = 0 (i.e. stationary). You press 5 to set the speed to 100, but it will not move until you give it a direction. If you now press 2, the robot will move forward at a speed of 100. You can then happily go along changing directions via the keypad. If you press 5 again, the robot will stop moving.
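The speed keys (5, * and #) can be modelled as a small state update, separate from the direction keys. A sketch of that toggle/step behaviour, where the step size and maximum are my assumptions rather than values from the actual code:

```cpp
// Update the current speed for the non-direction keys.
// STEP and MAXSPD are assumptions, not values from the robot's code.
int updateSpeed(char key, int speed) {
    const int STEP = 20;
    const int MAXSPD = 255;  // typical 8-bit PWM ceiling
    switch (key) {
        case '5': return speed == 0 ? 100 : 0;                          // stop/start toggle
        case '*': return speed > STEP ? speed - STEP : 0;               // slower
        case '#': return speed + STEP > MAXSPD ? MAXSPD : speed + STEP; // faster
        default:  return speed;  // direction keys leave the speed alone
    }
}
```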

With all this now working, the next update will be for the robot to use the distance sensor to determine how far away it is from objects in front of it, slowing and stopping if necessary to avoid hitting them.
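That slow-and-stop behaviour could be as simple as clamping whatever speed the keypad requested against the latest distance reading. A sketch of the idea; the two thresholds are assumptions I would expect to tune on the real robot:

```cpp
// Clamp the requested speed based on the distance (in mm) to the nearest
// obstacle ahead. Threshold values are assumptions to be tuned in practice.
int limitSpeed(int requested, int distance_mm) {
    const int STOP_MM = 100;  // closer than this: stop outright
    const int SLOW_MM = 300;  // closer than this: halve the speed
    if (distance_mm <= STOP_MM) return 0;
    if (distance_mm <= SLOW_MM) return requested / 2;
    return requested;
}
```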

I also want to optimise the code to make it more responsive if I can, and I’ll post the updates here and to the CIAOPSLabs repo when I get it all working.

Beyond that, I’m still trying to decide what to get the robot to do next. If you have any suggestions, let me know, but I’m kind of thinking that the robot needs to have ‘vision’ next!

DFRobot 1602 LCD display mount

image

Part of my recent Mecanum motion project required me to design and print a frame for the DFRobot 1602 LCD display, as shown above. It is basically a right-angle bracket that allows the display to be mounted onto the frame.

I’ve uploaded the STL model to:

https://github.com/directorcia/ciaopslabs/blob/main/3dprint/mounts/DFRobot-1602-LCD-disp.stl

so you can grab a copy and print it out for yourself. If you don’t have access to a 3D printer then send me a donation via:

https://ko-fi.com/ciaops

to cover the postage at least and I’ll send you a print.

Paint mixer

image

One of the problems I’ve recently solved with 3D printing was creating a simple paint mixer (shown above).

Basically, it is a block with a few divots in it. This allows you to easily mix a small amount of paint in each divot. The advantage over a flat area is that the mixing process doesn’t spread the paint too broadly. This solved a problem for me when building my plastic models, where I need small amounts of colours mixed together.

The dimensions of the mixer are:

76.8 mm long

19 mm wide

4.8 mm deep

I’ve uploaded the STL model here:

https://github.com/directorcia/ciaopslabs/blob/main/3dprint/tools/Paintmixer.stl

so you can grab a copy and print it out for yourself. If you don’t have access to a 3D printer then send me a donation via:

https://ko-fi.com/ciaops

to cover the postage at least and I’ll send you a print.

The wheels are turning

mc-wheels

Hopefully you can see from the above video that I have now got all the wheels turning on the frame in the same direction.

With the motor basics done, next I’ll get the rig talking to WiFi and outputting information to the LCD display.

Following that, the next step is to determine how to move in different directions with these Mecanum wheels.

New chassis

image

I’ve invested in a new chassis for my robot creations. This is the one I opted for:

Mecanum Wheel Chassis Car Kit with TT Motor, Aluminum Alloy Frame, Smart Car Kit for DIY Education Robot Car Kit

from Amazon. It is great quality, strong, and well made. It has lots of connection options and is the right size for what I want. It is also robust enough to survive the inevitable ‘incidents’ it will no doubt have along the way. In short, I think it is great value for money and very professionally made.

As you can see from the above picture, I have managed to put it together, even though it didn’t come with assembly instructions. Those I found here:

https://www.hiwonder.com/products/mecanum-wheel-chassis-car

and specifically on this YouTube video – https://youtu.be/WMQ-PyM-PNE

The motor connections are JST 2-pin female connectors so I have also ordered these:

JST PH 2-Pin Cable – Male Header 200mm

to allow easy connection to the motor driver I’ve already played with here:

https://blog.ciaops.com/2023/07/12/iot-motors/

Once I get all that I can start putting together the controllers and then start writing the code to make it actually move about.

Stay tuned.

Displaying distance on LCD screen

iot-dist

I was able to take the

Adafruit VL53L0X Time of Flight Distance Sensor

and combine it with all my recent LCD learnings to produce something that outputs a distance measurement as shown above.

Here is the circuit outline:

image

Note that it is important to ground the ESP32-S2 Thing Plus WROOM to the same ground as the other components.

The code for this project is at:

https://github.com/directorcia/Azure/blob/master/Iot/ESP32-S2/VL53L0X/LCD-Test.cpp

which is pretty straightforward and basically a combination of the sample files for the distance sensor and the LCD display.

Easy when you know

I’ve been attempting to get an LCD display connected to the ESP32-S2 WROOM controller but wasn’t having any luck:

No output to display

Luckily, I worked out that I needed to:

Connect the grounds

to make it work. For that I’d used a:

Gravity:I2C LCD1602 Arduino LCD Display module

which has everything built into the board.

After finding the error of my ways I wanted to circle back and get the original

Standard HD44780 LCD

display working with the

I2C LCD Backpack for 1602 to 2004 LC

Once I had wired the backpack to the LCD display (basically just a process of aligning pins on the breadboard), I connected up power, SDA, and SCL to the backpack. I then connected the SDA and SCL to the matching pins on the ESP32-S2 Thing Plus, as well as ensuring that the grounds of the ESP32-S2 Thing Plus and the external power for the display were connected!

For the code I added the LiquidCrystal_I2C library and then the header:

#include <LiquidCrystal_I2C.h>

next I initialised an object:

LiquidCrystal_I2C lcd(0x27,16,2);

with a module address of 0x27 (the default) and 16 columns by 2 rows.

In setup() I initialised the module with:

lcd.init();

turned on the backlight:

lcd.backlight();

and finally output text to the LCD display:

lcd.print("Hello, world!");

Once I compiled the code and uploaded to the ESP32-S2 Thing Plus I saw:

image

which was very satisfying.

The learning from all this has been to ALWAYS ensure that all the grounds are connected together. However, I’ve also learned that life is much easier with a completely integrated display like the:

Gravity:I2C LCD1602 Arduino LCD Display module

rather than trying to configure just the LCD directly, or even interfacing with a dedicated I2C backpack. It is far easier if all that is built into the LCD module and all the control comes via SDA and SCL.

Lessons learned. Bring on the next challenge.