This project is a robot-vs.-human Connect 4 game in which the human plays the robot on a physical game board, just as they would play another person. The vision is for a camera to scan the board and for motors to move a dispenser along the top of the Connect 4 board to drop in the robot's piece. The human player places their piece in a column; the computer then scans the board and uses an algorithm to determine the best column, and the robot moves to that column and drops a piece.
Motivation
I was motivated to create this project because Connect 4 is one of my favorite games and I wanted to gain experience building a robot. Upon further research, I found that no one had made a Connect 4 robot that was as intuitive as playing a normal game of Connect 4.
Current Version of Project
Due to a lack of time, I was unable to complete the mechanical side of this project. Currently, everything that would let the robot take its turn is finished except for the 3D-printed components. So the human player places their piece, the computer scans the board and determines the best column, the robot turns the stepper motor and servo as if it were dropping a piece, and finally the computer prompts the human player to place the robot's piece in the column it asks for. (This project uses a PocketBeagle, a small Linux board roughly equivalent to a Raspberry Pi.)
Additionally, I created a custom PCB that contains all of the components needed for this project: https://github.com/SMSARVER/ENGI301/tree/main/Project_2
This summer, I am working to upgrade the stepper motor from a tiny kit motor to a powerful NEMA 23 motor. I will also construct the robot, and I aim to have a fully functional prototype by August 2022.
Building the Project
Step 1: Text-Based Game
I based my project off of Keith Galli's virtual Connect 4 game: https://github.com/KeithGalli/Connect4-Python
I removed all the visual pygame elements and instead made it text based.
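The core of a text-based Connect 4 can be sketched roughly as below. This is a minimal illustration of the board representation and drop logic, not the project's actual code; all names here are illustrative:

```python
# Minimal sketch of a text-based Connect 4 board.
# 0 = empty, 1 = human, 2 = robot (illustrative encoding).
ROWS, COLS = 6, 7

def create_board():
    """Return an empty 6x7 board as a list of rows."""
    return [[0] * COLS for _ in range(ROWS)]

def drop_piece(board, col, piece):
    """Drop a piece into the lowest open row of `col`.
    Returns the row it landed in, or None if the column is full."""
    for row in range(ROWS - 1, -1, -1):
        if board[row][col] == 0:
            board[row][col] = piece
            return row
    return None

def print_board(board):
    """Render the board as text, one row per line."""
    symbols = {0: ".", 1: "X", 2: "O"}
    for row in board:
        print(" ".join(symbols[c] for c in row))

board = create_board()
drop_piece(board, 3, 1)   # human plays column 3
drop_piece(board, 3, 2)   # robot stacks on top
print_board(board)
```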
Step 2: Testing Components
The first component I tested was the 16x2 character LCD display. Wiring was very simple, but I had to install the adafruit_character_lcd.character_lcd library to drive the display. The LCD display also needs a potentiometer to control its contrast.
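Driving the display with that library looks roughly like the sketch below. The pin choices follow the wiring list later in this write-up, but treat them as assumptions and match them to your own wiring:

```python
# Sketch of driving a 16x2 character LCD via adafruit_character_lcd.
# Pin assignments are assumptions based on the wiring list in this write-up.
import board
import digitalio
import adafruit_character_lcd.character_lcd as characterlcd

lcd_columns, lcd_rows = 16, 2
lcd_rs = digitalio.DigitalInOut(board.P2_10)
lcd_en = digitalio.DigitalInOut(board.P2_17)
lcd_d4 = digitalio.DigitalInOut(board.P2_2)
lcd_d5 = digitalio.DigitalInOut(board.P2_4)
lcd_d6 = digitalio.DigitalInOut(board.P2_6)
lcd_d7 = digitalio.DigitalInOut(board.P2_8)

lcd = characterlcd.Character_LCD_Mono(
    lcd_rs, lcd_en, lcd_d4, lcd_d5, lcd_d6, lcd_d7, lcd_columns, lcd_rows
)
lcd.clear()
lcd.message = "Connect 4\nYour turn!"  # second line after the \n
```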
Next I worked on connecting the stepper motor. The stepper motor will move the piece dropper along the top of the Connect 4 board to the right column via a belt and pulley or a linear actuator. I used a stepper motor driver to control the stepper motor more easily. The driver requires 5V logic inputs, but the PocketBeagle can only output 3.3V, so I used a logic level converter to shift the signals from 3.3V to 5V. I used an external power supply (5V, 2A) to power the stepper motor. Wiring was simple, and I found some test code to use from https://github.com/petebachant/BBpystepper.
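The driver is stepped by energizing its four inputs in a fixed sequence. The sketch below shows the standard half-step pattern for a small four-wire kit stepper (e.g. the common 28BYJ-48 behind a ULN2003-style driver board, which this setup resembles, though that exact model is an assumption). On the PocketBeagle, each tuple would be written out to the four GPIO pins feeding the logic level converter:

```python
# Standard half-step sequence for a four-input stepper driver (IN1..IN4).
# On hardware, each tuple would be written to the driver inputs in order,
# e.g. via Adafruit_BBIO.GPIO.output() on the pins wired to the LLC.
HALF_STEPS = [
    (1, 0, 0, 0),
    (1, 1, 0, 0),
    (0, 1, 0, 0),
    (0, 1, 1, 0),
    (0, 0, 1, 0),
    (0, 0, 1, 1),
    (0, 0, 0, 1),
    (1, 0, 0, 1),
]

def step_pattern(step_index, direction=1):
    """Return the IN1..IN4 levels for the given step count.
    direction=1 steps forward; direction=-1 reverses the motor."""
    return HALF_STEPS[(direction * step_index) % len(HALF_STEPS)]
```

Stepping the motor is then just a loop that calls `step_pattern` with an incrementing index and a short `time.sleep` between writes.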
Next I connected the 360 degree continuous rotation servo motor. The servo motor will be used in the piece dropper to drop one piece into the right column. The servo motor also required a 5V power supply and I controlled the servo through the PocketBeagle's PWM pin. I used servo motor test code from my instructor, Eric Welsh.
The last piece of hardware to connect was the USB camera. Connecting the camera required tying the VB and VI pins together and the ID and GND pins together (I simply soldered them). To test the camera, I ran some basic code in the terminal:
python3
>>> import cv2
>>> cap = cv2.VideoCapture(0)        # open the first camera device
>>> ret, frame = cap.read()          # grab one frame; ret is False if the capture failed
>>> cv2.imwrite("temp.jpg", frame)   # save the frame so it can be inspected
>>> cap.release()
Step 3: OpenCV Code
Writing the OpenCV code was the most challenging part of this project. I started writing code to work on an image of a Connect 4 board and then modified it to work with photos from my USB camera. I put a green background behind the game board to make it easier to detect open spaces. To go from a picture of the board to a 6x7 array (what the main game code uses to represent the board) with the value of each space on the board, I first isolated the parts of the image (creating a "mask") that were red (robot spaces), blue (human spaces), and green (empty spaces). Then I used an OpenCV function called HoughCircles to detect the red, blue, and green circles on each mask. With all the circles detected, I sorted them based on their y and x positions in the image and then used this information to create the 6x7 array. Below is a photo of the code detecting the red circles on an image.
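The grouping step, going from detected circle centers to the 6x7 array, can be sketched as follows. This assumes HoughCircles has already returned one center per board space on the color masks; the names and the exact sort are illustrative, not the project's actual code:

```python
# Sketch of turning detected circle centers into a 6x7 board array.
# Each circle is (x, y, value) with value 0/1/2 for empty/human/robot,
# matching the green/blue/red masks described above (encoding assumed).
def circles_to_board(circles, rows=6, cols=7):
    """Sort circle centers into a rows x cols grid, top-left to bottom-right."""
    if len(circles) != rows * cols:
        raise ValueError("expected one detected circle per board space")
    # Sort by y to group the circles into rows (7 per row),
    # then sort each row left-to-right by x.
    by_y = sorted(circles, key=lambda c: c[1])
    board = []
    for r in range(rows):
        row = sorted(by_y[r * cols:(r + 1) * cols], key=lambda c: c[0])
        board.append([value for _, _, value in row])
    return board
```

In practice the y-sort tolerates some pixel jitter as long as the rows are farther apart vertically than the detection noise, which the green backdrop and fixed camera position help guarantee.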
Step 4: Implementing Everything in the Main Code
For the last step in this project, I converted the test component files (for the servo, LCD display, and stepper motor) into classes and the OpenCV code into a function and implemented them in the main game code.
Below are separate videos of a game versus the robot: one showing the OpenCV detection and one showing the stepper motor with the servo motor. I made two separate videos to focus on the different parts of the project finished so far.
Hardware Video:
OpenCV Video: (see code overview video for how OpenCV code works)
Wiring:
This image showing the PocketBeagle pins will be helpful in wiring:
LCD Display:
LCD_VSS to P2_15 (ground)
LCD_VDD to P2_13 (v_out)
LCD_V0 to Potentiometer center pin
LCD_RS to P2_10
LCD_RW to P2_15
LCD_E to P2_17
LCD_D4 to P2_2
LCD_D5 to P2_4
LCD_D6 to P2_6
LCD_D7 to P2_8
LCD_A to P2_13
LCD_K to P2_15
Potentiometer right pin to P2_13
Potentiometer left pin to P2_15 (ok to swap P2_13 and P2_15 on potentiometer)
USB Camera:
P1_5 to P1_7
P1_13 to P1_15
P1_7 to USB_VCC
P1_9 to USB_D-
P1_11 to USB_D+
P1_13 to USB_D
USB_D to USB_GND
Stepper Motor Driver (SMD), Logic Level Converter (LLC), Continuous Rotation Servo (CRS), Power Supply (PS):
LLC_VA to LLC_OE
LLC_VA to P2_13
LLC_A1 to P2_18
LLC_A2 to P2_20
LLC_A3 to P2_22
LLC_A4 to P2_24
LLC_VB to SMD_+
LLC_VB to PS_+ (5V)
LLC_VB to CRS_+
CRS_- to P2_15
CRS_PWM to P1_36
SMD_- to P2_15
LLC_B1 to SMD_IN1
LLC_B2 to SMD_IN2
LLC_B3 to SMD_IN3
LLC_B4 to SMD_IN4
PS_- to P2_15
Connect stepper motor to SMD via white click-in mechanism
These additional photos of the wiring may also help:
Libraries to install:
sudo apt-get install python3-numpy
sudo apt-get install python3-opencv
pip3 install adafruit-circuitpython-charlcd
sudo pip3 install Adafruit_BBIO
Code Overview Video:
GitHub: https://github.com/SMSARVER/ENGI301/tree/main/connect4
Next Steps and Improvements:
This summer and next semester, I plan to construct a fully functional prototype of the robot. This will require:
1) Constructing a mechanism to move the piece dropper to the correct column. I am leaning towards using a linear actuator for this. I also plan to add a limit switch for safety.
2) Upgrading the stepper motor to a faster, more powerful motor. I have selected a NEMA 23 stepper motor to move forward with. This motor requires a new motor driver and a 24V power supply.
3) Designing and 3D printing the piece dropper. Picture below of my current design:
4) Constructing a base to attach the Connect 4 board to, and building a stand to keep the camera and illuminating lights in the optimal locations.
5) Using "cron" to launch my project automatically when the PocketBeagle boots. This way, my robot can be fully standalone.
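A cron @reboot entry is one simple way to do this. The sketch below is an example crontab line (added with `crontab -e`); the script path and log location are assumptions and would need to match the actual install:

```shell
# Example crontab entry: launch the game at boot, after a short delay so
# USB and networking are up. Paths are illustrative, not the real install.
@reboot sleep 30 && /usr/bin/python3 /home/debian/connect4/connect4.py >> /tmp/connect4.log 2>&1
```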