Hardware & software used:
- AMB23 (Realtek) x 1
- LED x 1
- PyCharm
- MicroPython
Hand Controlled LED | OpenCV | MicroPython | IoT
LED + OpenCV + MicroPython + AMB23 IoT Microcontroller = Wireless Hand-Controlled LED
Laziness is one of the biggest drivers for engineers to find ways to simplify their lives. For me, I am too "lazy" to change the LED brightness manually, so I wrote a few Python scripts for both my PC and the Ameba IoT microcontroller, and now I can just sit there and change the LED brightness with a few hand gestures~
Demo
Introduction
Computer Vision has been around for quite a while, but running computer vision applications is usually a little too demanding for a microcontroller (MCU): it requires a lot of computation and also burns a lot of power. It is therefore better to offload the CV-related computation to a more capable machine, for example a PC or a cloud server, and let the MCU control sensors/actuators in real time.
This little project has two main building blocks, with the data they exchange sketched right after the list:
- Video capture and the computer vision algorithm -> PC
- Running MicroPython for wireless data transfer and LED control -> AMB23
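Between the two, the only thing that crosses the network is the brightness value itself, sent as a short ASCII string over a plain TCP socket. The snippet below is just a minimal sketch of that payload format, matching the encoding used in the scripts later in this post:

```python
# Sketch of the payload exchanged over TCP: the PC rounds the brightness to
# two decimals and sends it as an ASCII string; the board parses it back into
# a float and uses it directly as the PWM duty cycle (0.0 = off, 1.0 = full).
brightness = 0.75
payload = str(round(brightness, 2)).encode()  # what the PC sends, e.g. b'0.75'
duty = float(payload)                         # what the MicroPython side recovers
print(payload, duty)
```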
Steps
1. Video capture and Computer Vision Algorithm
To achieve hand gesture recognition, I chose the well-known [OpenCV](https://opencv.org/) and [MediaPipe](https://mediapipe.dev/) libraries, as they are open-source projects available on GitHub and both have Python bindings -- meaning I can write all my logic in Python for quick prototyping. The flow of the logic is simple: capture a frame, detect the hand landmarks, measure the distance between the thumb and index fingertips, map that distance to a brightness value, and send the value over TCP.
The Python script,
```python
import cv2
import time
import numpy as np
import math
import socket
import mediapipe as mp

########## Module #############
class handDetector():
    def __init__(self, mode=False, maxHands=2, detectionCon=0.5, trackCon=0.5):
        self.mode = mode
        self.maxHands = maxHands
        self.detectionCon = detectionCon
        self.trackCon = trackCon
        self.mpHands = mp.solutions.hands
        # keyword arguments keep this working across mediapipe versions
        self.hands = self.mpHands.Hands(static_image_mode=self.mode,
                                        max_num_hands=self.maxHands,
                                        min_detection_confidence=self.detectionCon,
                                        min_tracking_confidence=self.trackCon)
        self.mpDraw = mp.solutions.drawing_utils

    def findHands(self, img, draw=True):
        imgRGB = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        self.results = self.hands.process(imgRGB)
        # print(results.multi_hand_landmarks)
        if self.results.multi_hand_landmarks:
            for handLms in self.results.multi_hand_landmarks:
                if draw:
                    self.mpDraw.draw_landmarks(img, handLms,
                                               self.mpHands.HAND_CONNECTIONS)
        return img

    def findPosition(self, img, handNo=0, draw=True):
        lmList = []
        if self.results.multi_hand_landmarks:
            myHand = self.results.multi_hand_landmarks[handNo]
            for id, lm in enumerate(myHand.landmark):
                # print(id, lm)
                h, w, c = img.shape
                cx, cy = int(lm.x * w), int(lm.y * h)
                # print(id, cx, cy)
                lmList.append([id, cx, cy])
                if draw:
                    cv2.circle(img, (cx, cy), 15, (255, 0, 255), cv2.FILLED)
        return lmList

############## Variables ##################
wCam, hCam = 640, 480
pTime = 0
minBri = 0
maxBri = 1
briArd = 0

############## Declaration ##################
cap = cv2.VideoCapture(0)  # default camera is 0, if you have another cam, you can set it to 1
cap.set(3, wCam)
cap.set(4, hCam)
detector = handDetector(detectionCon=0.7)

########## Step 1 ###########
# Start a TCP server and bind to port 12345
# Use ipconfig to check the IP address of your PC
s = socket.socket()
print("Socket successfully created")
port = 12345
s.bind(('', port))
print("socket binded to %s" % (port))
s.listen(5)
print("socket is listening")
c, addr = s.accept()
print('Got connection from', addr)

######### Step 2 ###############
# Image capture and processing using mediapipe and opencv
while True:
    success, img = cap.read()
    img = detector.findHands(img)
    lmList = detector.findPosition(img, draw=False)
    if len(lmList) != 0:
        x1, y1 = lmList[4][1], lmList[4][2]
        x2, y2 = lmList[8][1], lmList[8][2]
        cx, cy = (x1 + x2) // 2, (y1 + y2) // 2
        cv2.circle(img, (x1, y1), 15, (255, 0, 255), cv2.FILLED)
        cv2.circle(img, (x2, y2), 15, (255, 0, 255), cv2.FILLED)
        cv2.line(img, (x1, y1), (x2, y2), (255, 0, 255), 3)
        cv2.circle(img, (cx, cy), 15, (255, 0, 255), cv2.FILLED)
        length = math.hypot(x2 - x1, y2 - y1)
        # print(length)
        # Hand range 50 - 300
        brightness = np.interp(length, [50, 300], [minBri, maxBri])
        briArd = np.around(brightness, 2)
        # print(briArd, brightness, length)
        if length < 50:
            cv2.circle(img, (cx, cy), 15, (0, 255, 0), cv2.FILLED)

    # Print FPS
    cTime = time.time()
    fps = 1 / (cTime - pTime)
    pTime = cTime
    cv2.putText(img, f'FPS: {int(fps)}', (40, 50), cv2.FONT_HERSHEY_COMPLEX,
                1, (255, 0, 0), 3)

    # Display image
    cv2.imshow("Img", img)
    cv2.waitKey(1)

    # Sending the distance between our thumb and index wirelessly to IoT device
    c.sendall(str(briArd).encode())
    print("send data success")
    print(briArd, str(briArd))
#c.close()
```
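Before flashing the board, the server side can be sanity-checked on its own with a minimal stand-in client, sketched below, that connects to the script above and prints whatever it receives (it assumes both run on the same PC, hence 127.0.0.1):

```python
# Minimal stand-in for the AMB23: connect to the TCP server started by the
# script above and print the brightness strings it sends. Run it on the same
# PC (127.0.0.1); use the PC's LAN address when testing from another machine.
import socket

c = socket.socket()
c.connect(("127.0.0.1", 12345))
while True:
    data = c.recv(512)
    if not data:
        break
    print("brightness:", data.decode())
c.close()
```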
2. Running MicroPython for wireless data transmission and LED control
MicroPython is a lean implementation of the Python 3 interpreter designed for microcontrollers, and the RTL8722DM_MINI (the AMB23 board used here) supports MicroPython.
With MicroPython, you can control all the peripherals on the microcontroller; in the case of the RTL8722DM_MINI, only 13 lines of Python code are needed to:
- Connect to Wi-Fi
- Start a TCP client socket
- Control LED brightness via PWM
Here is the MicroPython code,
```python
import socket
from wireless import WLAN
from machine import PWM

wifi = WLAN(mode=WLAN.STA)
wifi.connect(ssid="yourNetwork", pswd="password")  # change the ssid and pswd to yours

c = socket.SOCK()
# make sure to check the server IP address and update in the next line of code
c.connect("192.168.0.106", 12345)

p = PWM(pin="PA_23")

while True:
    data = c.recv(512)
    f_brightness = float(data[:3])
    print(f_brightness)
    p.write(f_brightness)
```
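One thing to keep in mind: TCP is a byte stream, so several brightness values can arrive in a single recv(), and data[:3] drops the second decimal of a value like 0.75. Below is a sketch of one possible variation of the receive loop, assuming the PC side is changed to append a newline delimiter to each value it sends; it reuses the c and p objects created above.

```python
# Variation of the receive loop, assuming the PC appends "\n" to every value,
# e.g. c.sendall((str(briArd) + "\n").encode()). Splitting on the delimiter
# handles several values arriving in one recv(), and clamping keeps the PWM
# duty cycle within 0.0-1.0.
buf = ""
while True:
    buf += c.recv(512).decode()
    while "\n" in buf:
        line, buf = buf.split("\n", 1)
        try:
            duty = min(max(float(line), 0.0), 1.0)
        except ValueError:
            continue
        print(duty)
        p.write(duty)
```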
Conclusion
With Wi-Fi and other wireless communications, IoT-enabled devices like the RTL8722DM_MINI can really be an integral part of an **AIoT** project. With the Python language, product prototyping can be even faster and smoother.
---------------------------------------------
This project is also available on Hackaday.io at https://hackaday.io/project/184456-wireless-gesture-control-led/details