Waveshare Wiki - New pages [en]
----
'''8inch 768x1024 LCD'''
{{Infobox item|colorscheme=green<br />
|img=[[File:8inch 768x1024 LCD.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/8inch-768x1024-lcd.htm}}]]<br />
|caption=<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop Official Website (CN)]
|website_en=[https://www.waveshare.com Website]<br />
}}
----
'''USB TO 4CH RS232'''
{{Infobox item|colorscheme=green<br />
|name=RS232 DB9 Male Port
|name2=RS232 DB9 Female Port
|img=[[File:USB TO 4CH RS232.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/usb-to-4ch-rs232.htm?sku=26854}}]]<br />
|img2=[[File:USB TO 4CH RS232-2.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/usb-to-4ch-rs232.htm?sku=27022}}]]<br />
|caption2=RS232<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop Official Website (CN)]
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
USB TO 4CH RS232 is an industrial isolated USB to RS232 converter based on the original FT4232HL chip. It features onboard protection circuits such as built-in power isolation, ADI magnetic isolation, and TVS diodes. Housed in an aluminum alloy case, USB TO 4CH RS232 offers simple operation and zero-delay automatic transmit/receive switching, provides fast, stable, reliable, and safe communication, and is suitable for industrial control applications with various communication requirements.
==Features==<br />
*Adopts the original FT4232HL chip for fast, stable, and reliable communication with good compatibility.
*Supports USB to 4-ch isolated RS232, convenient for expanding multiple RS232 industrial serial devices.
*Onboard unibody power supply isolation provides a stable isolated voltage.
*Onboard unibody digital isolation provides signal isolation with high reliability, strong anti-interference, and low power consumption.
*Onboard TVS (Transient Voltage Suppressor) diodes effectively suppress surge voltages and transient spikes in the circuit, providing lightning and ESD protection.
*Onboard self-recovery fuse and protection diodes keep the current/voltage output stable, providing over-current/over-voltage protection and improving shock resistance.
*Onboard power supply screw terminal allows 5V~36V DC wide-range input.
*9x LEDs for indicating the power and transceiver status.
*Aluminum alloy enclosure with sandblasted and anodized finish, solid and durable.
<br />
==Version Options==<br />
[[File:USB TO 4CH RS232 Ver.png]]<br />
<br />
==Parameters==<br />
{| class="wikitable"<br />
|-<br />
| style="width:155px;"| Product Types<br />
| colspan="2" rowspan="1"|Industrial isolated USB to RS232 converter<br />
|-<br />
| colspan="1" rowspan="1"| Host Interface<br />
| colspan="2" rowspan="1"|USB<br />
|-<br />
| colspan="1" rowspan="1"| Device Interface<br />
| colspan="2" rowspan="1"|RS232<br />
|-<br />
| colspan="1" rowspan="1"| Communication Range<br />
| colspan="2" rowspan="1"|300bps ~ 921600bps<br />
|-<br />
| colspan="1" rowspan="4"| USB Interface<br />
| style="width:79px;"| Operating Level<br />
| colspan="1" rowspan="1"|5V<br />
|-<br />
|| Connector<br />
| colspan="1" rowspan="1"|USB-B <br />
|-<br />
|| Protection<br />
| colspan="1" rowspan="1"|200mA self-recovery fuse<br />
|-<br />
|| Transmission Distance<br />
| colspan="1" rowspan="1"|About 5m<br />
|-<br />
| colspan="1" rowspan="2"| Power Port<br />
|| Supply Voltage <br />
| colspan="1" rowspan="1"|5 ~ 36V DC power screw terminal<br />
|-<br />
|| Protection<br />
| colspan="1" rowspan="1"|Anti-reverse<br />
|-<br />
| colspan="1" rowspan="4"| RS232 <br />
|| Connector<br />
| colspan="1" rowspan="1"| DB9 male port / female port<br />
|-<br />
|| Protection<br />
| colspan="1" rowspan="1"| TVS diode protection, anti-surge and ESD protection <br />
|-<br />
|| Transmission Distance<br />
| colspan="1" rowspan="1"| About 15m<br />
|-<br />
|| Transmission Mode<br />
| colspan="1" rowspan="1"|Point to Point<br />
|-<br />
| colspan="1" rowspan="3"| Indicator<br />
|| PWR<br />
| colspan="1" rowspan="1"|Red power indicator, lights up when the voltage is detected <br />
|-<br />
|| TXD<br />
| colspan="1" rowspan="1"|Green Transmitting indicator, lights up when data is sent from the corresponding port <br />
|-<br />
|| RXD<br />
| colspan="1" rowspan="1"|Blue Receiving indicator, data sent back from the corresponding port <br />
|-<br />
| colspan="1" rowspan="2"| Operating Environment<br />
|| Temperature Range<br />
| colspan="1" rowspan="1"|-40℃ ~ 85℃<br />
|-<br />
|| Humidity Range<br />
| colspan="1" rowspan="1"|5% ~ 95%RH<br />
|-<br />
| colspan="1" rowspan="1"| Operating System<br />
| colspan="2" rowspan="1"|Mac, Linux, Android, Windows 11 / 10 / 8.1 / 8 / 7<br />
|}<br /><br />
<br />
==Onboard Interface==<br />
[[File:USB TO 4CH RS232 Inter.png]]<br />
<br />
==Dimensions==<br />
[[File:USB TO 4CH RS232 Dem.jpg]]<br />
<br />
=Driver Installation=<br />
==USB Driver Installation==<br />
* Download the driver file [https://files.waveshare.com/wiki/USB-TO-RS232-485-%EF%BC%88B)/CDM212364_Setup.zip VCP Driver] <br /><br />
* Double click on '''CDM212364_Setup.exe''' and install it.<br /><br />
* Click on '''Extract''', and then "NEXT". <br /><br />
[[File: USB TO 4CH Serial Converter -03.png]]<br />
* Select '''I accept this agreement''', click on "NEXT", and then click on "Finish". <br/>
[[File: USB TO 4CH Serial Converter -04.png]]<br />
* After connecting the device to the PC, you can see the available COM port numbers in the Device Manager. <br/>
[[File: USB TO 4CH Serial Converter-05.png|250px]]<br /><br />
<br />
=Communication Operation=<br />
==Preparation==<br />
* Open the SSCOM software.<br /><br />
* Select the corresponding COM port according to the function; the four recognized COM port numbers correspond to Port A to Port D in descending order.
<br />
{|border=1; style="width:600px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|USB TO 4CH RS232 - PORT<br />
|style="background:green; color:white;text-align:center;"|Communication Mode<br />
|-<br />
|style="text-align:center;"|PORT A<br />
|style="text-align:center;"|RS232<br />
|-<br />
|style="text-align:center;"|PORT B<br />
|style="text-align:center;"|RS232<br />
|-<br />
|style="text-align:center;"|PORT C<br />
|style="text-align:center;"|RS232<br />
|-<br />
|style="text-align:center;"|PORT D<br />
|style="text-align:center;"|RS232<br />
|}<br />
* You can check which PORT corresponds to which COM port through the Device Manager.
* Right-click a COM port (for example, COM9) to check which PORT it corresponds to.
[[File: USB TO 4CH Serial Converter Environment 1.png|600px]]<br /><br />
* For example, you can see that PORT D corresponds to COM8.
[[File: USB TO 4CH Serial Converter Environment 2.png|600px]]<br /><br />
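If you prefer to enumerate the ports programmatically instead of using the Device Manager, the following minimal Python sketch (assuming the '''pyserial''' package is installed) lists the serial ports visible to the system:
<syntaxhighlight lang="python">
from serial.tools import list_ports

# List every serial port the system can see; the four FT4232H channels
# typically show up as consecutive ports (e.g. COM8..COM11 on Windows).
for port in list_ports.comports():
    # port.device is e.g. "COM8" on Windows or "/dev/ttyUSB0" on Linux;
    # port.description usually includes the FTDI channel (A/B/C/D)
    print(port.device, "-", port.description)
</syntaxhighlight>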
<br />
==RS232 Communication==<br />
The following demonstrates RS232 communication between two ports of the product, using PORT A and PORT B as an example.
<br />
===Hardware Communication===<br />
{|border=1; style="width:600px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|USB TO 4CH RS232 - PORT A<br />
|style="background:green; color:white;text-align:center;"|USB TO 4CH RS232 - PORT B<br />
|-<br />
|style="text-align:center;"|PORT A - RXD<br />
|style="text-align:center;"|PORT B - TXD<br />
|-<br />
|style="text-align:center;"|PORT A - TXD<br />
|style="text-align:center;"|PORT B - RXD<br />
|-<br />
|style="text-align:center;"|PORT A - GND<br />
|style="text-align:center;"|PORT B - GND<br />
|}<br />
<br />
===Software Operation===<br />
* Open two SSCOM windows.
* Select the COM ports corresponding to Port A and Port B respectively.
[[File: USB TO 4CH Serial Converter 15.png]]<br /><br />
* Set the baud rate to '''115200''', input the characters to be sent, select '''Add time stamp and package''', and click on '''Open COM'''.
[[File: USB TO 4CH Serial Converter 16.png|600px]]<br /><br />
*Set the timed-send interval to '''100ms''' in both SSCOM windows; the two windows should then transmit and receive normally, with the test result shown below:
[[File: USB TO 4CH Serial Converter 11.png]]<br /><br />
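The same loopback test can also be scripted. Below is a minimal Python sketch, assuming '''pyserial''' is installed, that PORT A and PORT B enumerated as COM9 and COM8 (substitute the names from your Device Manager), and that the two ports are cross-wired (TXD-RXD, RXD-TXD, GND-GND) as in the table above:
<syntaxhighlight lang="python">
import serial

# Open the two COM ports at the same baud rate
port_a = serial.Serial("COM9", baudrate=115200, timeout=1)
port_b = serial.Serial("COM8", baudrate=115200, timeout=1)

port_a.write(b"hello from PORT A\r\n")  # transmit on PORT A
print(port_b.readline())                # should print b'hello from PORT A\r\n'

port_a.close()
port_b.close()
</syntaxhighlight>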
<br />
=Resource=<br />
==Datasheet==<br />
* [https://files.waveshare.com/wiki/USB-TO-4CH-RS232-485/DS_FT4232H-Datasheet.pdf FT4232H-Datasheet]<br /><br />
<br />
==Software & Driver==<br />
* [https://files.waveshare.com/wiki/USB-TO-4CH-RS232-485/CDM212364_Setup.zip VCP driver] ( or you can download it from [https://ftdichip.com/drivers/vcp-drivers/ FTDI website] )<br /><br />
* On Linux the driver is built into the kernel, so no driver installation is needed.
* [https://www.waveshare.com/w/upload/2/20/Cktszsss32.zip SSCOM]<br /><br />
=FAQ=<br />
{{FAQ|Only 1 COM port can be recognized when the device is connected to the computer (if the computer normally recognizes 4 COM ports, please do not do the following).<br />
|<br />
*It may be caused by a corrupted registry, which you can fix by performing the following steps:
1. Download [https://files.waveshare.com/wiki/USB-TO-4CH-Serial-Converter/COMDB.zip the registry-related demo] and run it.<br/><br />
[[File: USB TO 4CH Serial Converter FAQ-01.png]]<br/>[[File: USB TO 4CH Serial Converter FAQ-00.png]]<br/><br />
2. If there is a pop-up window, please allow the demo to run.<br/><br />
3. Download the [https://files.waveshare.com/wiki/USB-TO-4CH-Serial-Converter/CDMUninstaller_v1.4.zip Configuration Software] and run ''' CDMUninstaller.exe'''. '''(If CDMUninstaller.exe cannot run successfully, please use CDMuninstallerGUI.exe.)'''<br/>
[[File: USB TO 4CH Serial Converter FAQ-02.png]]<br/><br />
4. Configure and execute: erase the value of "Product ID", click "Add", then click "Remove Devices" and confirm with "OK". <br/>
[[File: USB TO 4CH Serial Converter FAQ-03.png]]<br/><br />
5. If ''' CDMUninstaller.exe''' cannot run successfully (as shown below), please use ''' CDMuninstallerGUI.exe'''. <br/>
[[File: USB TO 4CH Serial Converter FAQ-04.png]]<br/><br />
||}}<br />
{{FAQ|WIN7 driver installation failed?<br />
|<br />
Install this driver:<br><br />
[https://www.waveshare.com/w/upload/0/0c/CDM_v2.08.30_WHQL_Certified.zip FT232-WIN7-Driver]<br />
||}}<br />
=Support=<br />
{{Servicebox1}}
----
'''Tutorial III: Motor With Encoder Control Demo 3'''
==Modules Usage Tutorial==<br />
*[[How To Install Arduino IDE]]<br />
*[[Tutorial I: Motor With Encoder Control Demo]]<br />
*[[Tutorial II: Motor With Encoder Control Demo 2]]<br />
*[[Tutorial III: Motor With Encoder Control Demo 3]]<br />
*[[Tutorial II: Motor Without Encoder Control Demo|Tutorial IV: Motor Without Encoder Control Demo]]<br />
*[[Tutorial III: ST3215 Serial Bus Servo Control Demo|Tutorial V: ST3215 Serial Bus Servo Control Demo ]]<br />
*[[Tutorial IV: PWM Servo Control Demo|Tutorial VI: PWM Servo Control Demo]]<br />
*[[Tutorial V: IMU Data Reading Demo|Tutorial VII: IMU Data Reading Demo]]<br />
*[[Tutorial VI: SD Card Reading Demo| Tutorial VIII: SD Card Reading Demo]]<br />
*[[Tutorial VII: INA219 Voltage And Current Monitoring Demo|Tutorial IX: INA219 Voltage And Current Monitoring Demo]]<br />
*[[Tutorial VIII: OLED Screen Control Demo|Tutorial X: OLED Screen Control Demo]]<br />
*[[Tutorial IX Lidar and Publishing Lidar Topics in ROS2|Tutorial XI Lidar and Publishing Lidar Topics in ROS2 ]]<br />
*[[General Driver for Robots|General Driver for Robots WIKI Main Page]]<br />
==Motor With Encoder Control Demo 3==<br />
This tutorial integrates the functionalities of Demo 1 and Demo 2, aiming for closed-loop control of motor speed and inputting the target rotation speed.<br />
==Demo==<br />
===Upload Demo===<br />
After downloading and opening the demo sketch (see the Resource section at the bottom of this page), use the USB cable to connect the multifunctional driver board to the computer (inserted into the USB Type-C port of the driver board), click on "Tools" → "Ports", and then click on the newly appeared COM port.<br>
[[File: UGV1 doenload03EN.png|500px]]<br><br />
In Arduino IDE, click "Tools" → "Development Board" → "ESP32" → "ESP32 Dev Module", select the development board and the port, and then upload the demo. After uploading, connect a motor with an encoder to the motor interface on the driver board and connect the XH2.54 power port to the power supply. The two motors will then run under closed-loop speed control toward the target speed set in the demo (100 RPM by default), and the measured rotation speed of each motor is printed to the serial monitor.
<br />
===Demo Analysis===<br />
<syntaxhighlight lang="cpp"><br />
// --- --- --- Encoder Part --- --- ---<br />
<br />
// Encoder A pin definition <br />
const uint16_t AENCA = 35; // Encoder A input A_C2(B)<br />
const uint16_t AENCB = 34; // Encoder A input A_C1(A)<br />
<br />
// Encoder B pin definition <br />
const uint16_t BENCB = 16; // Encoder B input B_C2(B)<br />
const uint16_t BENCA = 27; // Encoder B input B_C1(A)<br />
<br />
//To calculate the number of transitions between high and low levels of a Hall sensor of the encoder within a given "interval" time (in milliseconds)<br />
//If you are using RISING edge detection for initializing interrupts, you'll specifically count the number of transitions from low to high level<br />
volatile long B_wheel_pulse_count = 0;<br />
volatile long A_wheel_pulse_count = 0;<br />
<br />
// The interval (in milliseconds) at which the speed is computed
int interval = 100;
<br />
//The current time <br />
long currentMillis = 0;<br />
<br />
// The reduction ratio means that the motor (rotor) speed differs from the speed of the gearbox output shaft.
// For example, the DCGM3865 motor has a reduction ratio of 1:42: the rotor turns 42 times for each revolution of the output shaft.
// The higher the reduction ratio, the more rotor revolutions are needed per output-shaft revolution; higher reduction ratios typically provide greater torque.
// Take the DCGM3865 motor (reduction ratio 1:42) as an example:
double reduction_ratio = 42;<br />
<br />
//Number of encoder lines, one revolution of the motor, the number of high and low level changes of one Hall sensor of the encoder<br />
int ppr_num = 11;<br />
<br />
// When the output shaft completes one revolution, the number of transitions observed in a single Hall sensor<br />
double shaft_ppr = reduction_ratio * ppr_num;<br />
<br />
// The callback function of the interrupt function, refer to the attachInterrupt() function later<br />
void IRAM_ATTR B_wheel_pulse() {<br />
if(digitalRead(BENCA)){<br />
B_wheel_pulse_count++;<br />
}<br />
else{<br />
B_wheel_pulse_count--;<br />
}<br />
}<br />
<br />
void IRAM_ATTR A_wheel_pulse() {<br />
if(digitalRead(AENCA)){<br />
A_wheel_pulse_count++;<br />
}<br />
else{<br />
A_wheel_pulse_count--;<br />
}<br />
}<br />
// --- --- --- --- --- --- --- --- ---<br />
<br />
// --- --- --- Motor Part --- --- ---<br />
// The following defines how to control the ESP32 pin of TB6612<br />
// Motor A<br />
const uint16_t PWMA = 25; // Motor A PWM control Orange<br />
const uint16_t AIN2 = 17; // Motor A input 2 Brown<br />
const uint16_t AIN1 = 21; // Motor A input 1 Green<br />
<br />
// Motor B <br />
const uint16_t BIN1 = 22; // Motor B input 1 Yellow<br />
const uint16_t BIN2 = 23; // Motor B input 2 Purple<br />
const uint16_t PWMB = 26; // Motor B PWM control White<br />
<br />
// PWM frequency of pins used for PWM outputs<br />
int freq = 100000;<br />
<br />
// Define PWM channel<br />
int channel_A = 5;<br />
int channel_B = 6;<br />
<br />
// Defines PWM accuracy, when it is 8, the PWM value is 0-255 (2^8-1)<br />
const uint16_t ANALOG_WRITE_BITS = 8;<br />
// The maximum PWM value <br />
const uint16_t MAX_PWM = pow(2, ANALOG_WRITE_BITS)-1;<br />
// The minimum PWM value; due to the poor low-speed characteristics of DC motors, duty cycles below this may not reach the motor's rotation threshold
const uint16_t MIN_PWM = MAX_PWM/5;<br />
<br />
// Motor A control <br />
void channel_A_Ctrl(float pwmInputA){<br />
// Round the pwmInput value to the nearest integer<br />
int pwmIntA = round(pwmInputA);<br />
<br />
// If pwmInput is 0, it stops rotating <br />
if(pwmIntA == 0){<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, LOW);<br />
return;<br />
}<br />
<br />
// Determine the direction of rotation by determining the positive or negative pwmInput value<br />
if(pwmIntA > 0){<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, HIGH);<br />
// constrain() function is for limiting the pwmIntA value between MIN_PWM and MAX_PWM <br />
ledcWrite(channel_A, constrain(pwmIntA, MIN_PWM, MAX_PWM));<br />
}<br />
else{<br />
digitalWrite(AIN1, HIGH);<br />
digitalWrite(AIN2, LOW);<br />
ledcWrite(channel_A,-constrain(pwmIntA, -MAX_PWM, 0));<br />
}<br />
}<br />
<br />
// Motor B control <br />
void channel_B_Ctrl(float pwmInputB){<br />
int pwmIntB = round(pwmInputB);<br />
if(pwmIntB == 0){<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, LOW);<br />
return;<br />
}<br />
<br />
if(pwmIntB > 0){<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, HIGH);<br />
ledcWrite(channel_B, constrain(pwmIntB, 0, MAX_PWM));<br />
}<br />
else{<br />
digitalWrite(BIN1, HIGH);<br />
digitalWrite(BIN2, LOW);<br />
ledcWrite(channel_B,-constrain(pwmIntB, -MAX_PWM, 0));<br />
}<br />
}<br />
// --- --- --- --- --- --- --- --- ---<br />
<br />
// --- --- --- Closed-loop Control --- --- ---<br />
// PID Controller Parameters <br />
double Kp = 0.05; // Scale factor<br />
double Ki = 0.05; // Integral coefficient<br />
double Kd = 0; // Differentiation factor<br />
<br />
// Target rotation speed and the actual rotation speed <br />
double targetSpeed_A = 100.0; // Target rotation speed (Adjustable on request) <br />
double actualSpeed_A = 0.0; // The actual rotation speed <br />
double targetSpeed_B = 100.0; // The target rotation speed (Adjustable on request) <br />
double actualSpeed_B = 0.0; // The actual rotation speed <br />
<br />
// PID controller variables <br />
double previousError_A = 0.0;<br />
double integral_A = 0.0;<br />
double previousError_B = 0.0;<br />
double integral_B = 0.0;<br />
// --- --- --- --- --- --- --- --- ---<br />
<br />
void setup(){<br />
// Set up the working mode of the encoder's related pins <br />
pinMode(BENCB , INPUT_PULLUP);<br />
pinMode(BENCA , INPUT_PULLUP);<br />
<br />
pinMode(AENCB , INPUT_PULLUP);<br />
pinMode(AENCA , INPUT_PULLUP);<br />
<br />
// Set the interrupt and its callback so that B_wheel_pulse() is called when BENCB changes from low to high (RISING)
attachInterrupt(digitalPinToInterrupt(BENCB), B_wheel_pulse, RISING);<br />
// Set the interrupt and its callback so that A_wheel_pulse() is called when AENCB changes from low to high (RISING)
attachInterrupt(digitalPinToInterrupt(AENCB), A_wheel_pulse, RISING);<br />
<br />
//Initialize the serial port, and the baud rate is 115200<br />
Serial.begin(115200);<br />
// Wait for serial port initialization to complete <br />
while(!Serial){}<br />
<br />
// Setting the operating mode of the ESP32 pin used to control the TB6612FNG <br />
pinMode(AIN1, OUTPUT);<br />
pinMode(AIN2, OUTPUT);<br />
pinMode(PWMA, OUTPUT);<br />
pinMode(BIN1, OUTPUT);<br />
pinMode(BIN2, OUTPUT);<br />
pinMode(PWMB, OUTPUT);<br />
<br />
// Setting the channel, frequency, and accuracy of the ESP32 pin used for PWM outputs <br />
ledcSetup(channel_A, freq, ANALOG_WRITE_BITS);<br />
ledcAttachPin(PWMA, channel_A);<br />
<br />
ledcSetup(channel_B, freq, ANALOG_WRITE_BITS);<br />
ledcAttachPin(PWMB, channel_B);<br />
<br />
// The pin used to control rotation is placed at a low level, the motor stops rotating to avoid starting rotation immediately after initialization<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, LOW);<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, LOW);<br />
}<br />
<br />
void loop(){<br />
// Calculate the speed of the output shaft of the B-channel motor in revolutions per minute<br />
actualSpeed_B = (float)((B_wheel_pulse_count / shaft_ppr) * 60 * (1000 / interval));<br />
B_wheel_pulse_count = 0;<br />
<br />
// Calculate the speed of the output shaft of the A-channel motor in revolutions per minute <br />
actualSpeed_A = (float)((A_wheel_pulse_count / shaft_ppr) * 60 * (1000 / interval));<br />
A_wheel_pulse_count = 0;<br />
<br />
// Calculation error and control volume <br />
double error_A = targetSpeed_A - actualSpeed_A;<br />
integral_A += error_A;<br />
double derivative_A = error_A - previousError_A;<br />
<br />
double error_B = targetSpeed_B - actualSpeed_B;<br />
integral_B += error_B;<br />
double derivative_B = error_B - previousError_B;<br />
<br />
// Compute PID output <br />
double output_A = Kp * error_A + Ki * integral_A + Kd * derivative_A;<br />
double output_B = Kp * error_B + Ki * integral_B + Kd * derivative_B;<br />
<br />
// output_A += Kp * error_A;<br />
// output_B += Kp * error_B;<br />
<br />
// Limits the output range <br />
output_A = constrain(output_A, -MAX_PWM, MAX_PWM);<br />
output_B = constrain(output_B, -MAX_PWM, MAX_PWM);<br />
<br />
// Output the PWM signal, control the motor rotation speed <br />
channel_A_Ctrl(-output_A);<br />
channel_B_Ctrl(-output_B);<br />
<br />
// Update the last error value<br />
previousError_A = error_A;<br />
previousError_B = error_B;<br />
<br />
Serial.print("RPM_A: ");Serial.print(actualSpeed_A);Serial.print(" RPM_B: ");Serial.println(actualSpeed_B);<br />
Serial.println("--- --- ---");<br />
<br />
delay(interval);<br />
}<br />
</syntaxhighlight><br />
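As a quick sanity check of the speed calculation and PID step in the demo above, the same arithmetic can be reproduced in plain Python (no hardware needed; the pulse count of 77 is an illustrative value, not measured data):
<syntaxhighlight lang="python">
# Speed calculation: pulses per interval -> output-shaft RPM
reduction_ratio = 42
ppr_num = 11
shaft_ppr = reduction_ratio * ppr_num   # 462 pulse transitions per output-shaft revolution
interval = 100                          # ms

pulse_count = 77                        # pulses counted in one 100 ms interval
rpm = (pulse_count / shaft_ppr) * 60 * (1000 / interval)
print(rpm)                              # 100.0 RPM

# One PID step with the demo's gains
Kp, Ki, Kd = 0.05, 0.05, 0.0
target, actual = 100.0, 90.0
integral, previous_error = 0.0, 0.0

error = target - actual                 # 10.0
integral += error
output = Kp * error + Ki * integral + Kd * (error - previous_error)
print(output)                           # 1.0, the PWM correction before constrain()
</syntaxhighlight>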
<br />
==Resource==<br />
*[https://files.waveshare.com/upload/d/dc/SpeedLoopCtrl.rar Demo]
----
'''Tutorial II: Motor With Encoder Control Demo 2'''
==Modules Usage Tutorial==<br />
*[[How To Install Arduino IDE]]<br />
*[[Tutorial I: Motor With Encoder Control Demo]]<br />
*[[Tutorial II: Motor With Encoder Control Demo 2]]<br />
*[[Tutorial III: Motor With Encoder Control Demo 3]]<br />
*[[Tutorial II: Motor Without Encoder Control Demo|Tutorial IV: Motor Without Encoder Control Demo]]<br />
*[[Tutorial III: ST3215 Serial Bus Servo Control Demo|Tutorial V: ST3215 Serial Bus Servo Control Demo ]]<br />
*[[Tutorial IV: PWM Servo Control Demo|Tutorial VI: PWM Servo Control Demo]]<br />
*[[Tutorial V: IMU Data Reading Demo|Tutorial VII: IMU Data Reading Demo]]<br />
*[[Tutorial VI: SD Card Reading Demo| Tutorial VIII: SD Card Reading Demo]]<br />
*[[Tutorial VII: INA219 Voltage And Current Monitoring Demo|Tutorial IX: INA219 Voltage And Current Monitoring Demo]]<br />
*[[Tutorial VIII: OLED Screen Control Demo|Tutorial X: OLED Screen Control Demo]]<br />
*[[Tutorial IX Lidar and Publishing Lidar Topics in ROS2|Tutorial XI Lidar and Publishing Lidar Topics in ROS2 ]]<br />
*[[General Driver for Robots|General Driver for Robots WIKI Main Page]]<br />
==Motor With Encoder Control Demo 2==<br />
This tutorial covers controlling a motor's rotation direction (forward and reverse) and speed (fast and slow); a demo for controlling motor rotation is provided below.
==Demo==<br />
===Upload Demo===<br />
After downloading the motorCtrl.ino, use the USB cable to connect the multifunctional driver board and the computer (here inserted into the USB Type-C port of the multifunctional driver board), click on "Tools" → "Ports", and then click on the newly appeared COM port.<br><br />
[[File: UGV1 doenload03EN.png|500px]]<br><br />
In Arduino IDE, click "Tools" → "Development Board" → "ESP32" → "ESP32 Dev Module", select the development board and the port, and then upload the demo. After uploading the demo, connect the motor to the PH2.0 2P motor interface on the motor driver board (an encoder is not needed for this open-loop demo). Connect the XH2.54 power port to the power supply. You will then observe the motor rapidly rotating in the positive direction for 3 seconds, then slowly rotating in the opposite direction for 3 seconds, followed by a pause of 3 seconds, in a continuous loop.<br>
<br />
===Demo Analysis===<br />
<syntaxhighlight lang="cpp"><br />
// The following defines the ESP32 pins used to control the TB6612<br />
// Motor A<br />
const uint16_t PWMA = 25; // Motor A PWM control Orange<br />
const uint16_t AIN2 = 17; // Motor A input 2 Brown<br />
const uint16_t AIN1 = 21; // Motor A input 1 Green<br />
<br />
// Motor B<br />
const uint16_t BIN1 = 22; // Motor B input 1 Yellow<br />
const uint16_t BIN2 = 23; // Motor B input 2 Purple<br />
const uint16_t PWMB = 26; // Motor B PWM control White<br />
<br />
// PWM frequency of pins used for PWM outputs<br />
int freq = 100000;<br />
<br />
// Define PWM channel<br />
int channel_A = 5;<br />
int channel_B = 6;<br />
<br />
// Define the PWM precision; when it is 8, the PWM value range is 0-255 (2^8-1)
const uint16_t ANALOG_WRITE_BITS = 8;<br />
// The maximum PWM value<br />
const uint16_t MAX_PWM = pow(2, ANALOG_WRITE_BITS)-1;<br />
// The minimum PWM value; due to the poor low-speed characteristics of DC motors, duty cycles below this may not reach the motor's rotation threshold.
const uint16_t MIN_PWM = MAX_PWM/5;<br />
<br />
<br />
void setup(){<br />
// Setting the operating mode of the ESP32 pin used to control the TB6612FNG<br />
pinMode(AIN1, OUTPUT);<br />
pinMode(AIN2, OUTPUT);<br />
pinMode(PWMA, OUTPUT);<br />
pinMode(BIN1, OUTPUT);<br />
pinMode(BIN2, OUTPUT);<br />
pinMode(PWMB, OUTPUT);<br />
<br />
// Setting the channel, frequency, and accuracy of the ESP32 pin used for PWM outputs <br />
ledcSetup(channel_A, freq, ANALOG_WRITE_BITS);<br />
ledcAttachPin(PWMA, channel_A);<br />
<br />
ledcSetup(channel_B, freq, ANALOG_WRITE_BITS);<br />
ledcAttachPin(PWMB, channel_B);<br />
<br />
// The pin used to control rotation should be set to a low logic level to stop the motor from rotating, thereby avoiding immediate rotation upon initialization<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, LOW);<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, LOW);<br />
}<br />
<br />
<br />
// Motor A control <br />
void channel_A_Ctrl(float pwmInputA){<br />
// Round the pwmInput value to the nearest integer<br />
int pwmIntA = round(pwmInputA);<br />
if(pwmIntA == 0){<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, LOW);<br />
return;<br />
}<br />
<br />
// Determine the direction of rotation by determining the positive or negative pwmInput value <br />
if(pwmIntA > 0){<br />
digitalWrite(AIN1, LOW);<br />
digitalWrite(AIN2, HIGH);<br />
// constrain() function is for limiting the pwmIntA value between MIN_PWM and MAX_PWM<br />
ledcWrite(channel_A, constrain(pwmIntA, MIN_PWM, MAX_PWM));<br />
}<br />
else{<br />
digitalWrite(AIN1, HIGH);<br />
digitalWrite(AIN2, LOW);<br />
ledcWrite(channel_A,-constrain(pwmIntA, -MAX_PWM, -MIN_PWM));<br />
}<br />
}<br />
<br />
// Motor B control <br />
void channel_B_Ctrl(float pwmInputB){<br />
int pwmIntB = round(pwmInputB);<br />
if(pwmIntB == 0){<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, LOW);<br />
return;<br />
}<br />
<br />
if(pwmIntB > 0){<br />
digitalWrite(BIN1, LOW);<br />
digitalWrite(BIN2, HIGH);<br />
ledcWrite(channel_B, constrain(pwmIntB, MIN_PWM, MAX_PWM));<br />
}<br />
else{<br />
digitalWrite(BIN1, HIGH);<br />
digitalWrite(BIN2, LOW);<br />
ledcWrite(channel_B,-constrain(pwmIntB, -MAX_PWM, -MIN_PWM));<br />
}<br />
}<br />
<br />
<br />
void loop(){<br />
// Motor stops for 3 seconds <br />
channel_A_Ctrl(0);<br />
channel_B_Ctrl(0);<br />
delay(3000);<br />
<br />
// Motor reverses direction and turns at low speed for 3 seconds
channel_A_Ctrl(-64);<br />
channel_B_Ctrl(-64);<br />
delay(3000);<br />
<br />
// Motor turns in the positive direction at high speed for 3 seconds
channel_A_Ctrl(255);<br />
channel_B_Ctrl(255);<br />
delay(3000);<br />
}<br />
</syntaxhighlight><br />
<br />
==Resource==<br />
*[https://files.waveshare.com/upload/d/dc/MotorCtrl.rar Demo]
----
='''OpenCV Color Tracking'''=<br />
In this chapter, we add functions that control the peripheral interfaces to the OpenCV demo. The camera pan-tilt will move while it runs, so please keep your hands and other fragile objects away from its rotation radius.
=='''Preparation'''==<br />
The product runs the main demo by default, and the main demo occupies the camera resources; in that case this tutorial is not applicable. Please terminate the main demo, or disable its auto-start and then reboot the robot.
<br />
It's worth noting that because the robot's main demo uses multi-threading and is configured to run automatically at startup through crontab, the usual method "sudo killall python" typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
<br />
If you have disabled the boot autorun of the robot's main program, you do not need to execute the '''Terminate Main Demo''' section below.<br />
<br />
==='''Terminate Main Demo'''===<br />
1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
<br />
2. Click on "Terminal" in Other, and open the terminal window. <br />
<br />
3. Input '''bash''' in the terminal window and press Enter. <br />
<br />
4. Now you can use "Bash Shell" to control the robot. <br />
<br />
5. Input the command: '''crontab -e'''<br />
<br />
6. If you are asked which editor to use, input '''1''' and press Enter to select "Nano".
<br />
7. Open "crontab" config file, and you can see: <br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
<br />
8. Add '''#''' in front of '''……app.py >> ……''' to comment out this line. <br />
<syntaxhighlight lang="python"><br />
# @reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
<br />
9. In the terminal interface, press "Ctrl + X" to exit. When asked '''Save modified buffer?''', press '''Y''' and then Enter to save the changes.
<br />
10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
<br />
11. Note that since the lower machine continuously communicates with the host over the serial port, the host may fail to boot during the restart because of the constantly changing serial port levels. Taking a Raspberry Pi host as an example: if, after powering down, the Raspberry Pi's green LED stays constantly on without blinking, turn the robot's power switch off and then on again, and the robot will restart normally.
<br />
12. Input the command to reboot: '''sudo reboot'''<br />
<br />
13. After the device restarts (the Raspberry Pi's green LED blinks during startup; when the blinking slows down or stops, startup is complete), refresh the page and continue with the remaining part of this tutorial.
<br />
=='''Demo'''==<br />
Directly run the following demo: <br />
<br />
1. Choose the following demo: <br />
<br />
2. Run it by Shift + Enter. <br />
<br />
3. View the real-time video window: <br />
<br />
4. Press '''STOP''' to stop the real-time video and release the camera resources. <br />
<br />
==='''If you cannot see the real-time camera feed when running:'''===<br />
*Click on Kernel -> Shut down all kernels above. <br />
*Close the current section tab and open it again. <br />
*Click '''STOP''' to release the camera resources, then run the code block again. <br />
*Reboot the device.<br />
<br />
==='''Run the Demo'''===<br />
In this chapter of the tutorial, the camera pan-tilt will rotate; make sure your hands and other fragile objects are away from its rotation radius.<br>
The demo detects a blue ball by default. Please make sure there are no blue objects in the background that could interfere with the color recognition; you can also change the detection color (HSV color space) through secondary development.
<br />
<syntaxhighlight lang="python"><br />
import matplotlib.pyplot as plt
import cv2
from picamera2 import Picamera2
import numpy as np
from IPython.display import display, Image
import ipywidgets as widgets
import threading
import math     # used by gimbal_track() for the distance calculation
import imutils  # used for grab_contours()

# The gimbal is driven through the product's base_ctrl library; adjust the
# serial port below to match your host (assumption: '/dev/ttyAMA0' on Raspberry Pi)
from base_ctrl import BaseController
base = BaseController('/dev/ttyAMA0', 115200)

# Stop button
# ================
stopButton = widgets.ToggleButton(
    value=False,
    description='Stop',
    disabled=False,
    button_style='danger', # 'success', 'info', 'warning', 'danger' or ''
    tooltip='Description',
    icon='square' # (FontAwesome names without the `fa-` prefix)
)

# Gimbal state and tracking parameters. In the main program these come from
# config.yaml; starting values are set here so the demo runs standalone
# (the speed/acceleration rates follow the examples in the command-line chapter)
gimbal_x = 0          # current pan angle
gimbal_y = 0          # current tilt angle
track_spd_rate = 60   # pan-tilt rotation speed rate
track_acc_rate = 0.4  # pan-tilt acceleration rate
CMD_GIMBAL = 133      # gimbal control command code (check cmd_config in config.yaml for your firmware's value)

def gimbal_track(fx, fy, gx, gy, iterate):
    global gimbal_x, gimbal_y
    distance = math.sqrt((fx - gx) ** 2 + (gy - fy) ** 2)
    gimbal_x += (gx - fx) * iterate
    gimbal_y += (fy - gy) * iterate
    if gimbal_x > 180:
        gimbal_x = 180
    elif gimbal_x < -180:
        gimbal_x = -180
    if gimbal_y > 90:
        gimbal_y = 90
    elif gimbal_y < -30:
        gimbal_y = -30
    gimbal_spd = int(distance * track_spd_rate)
    gimbal_acc = int(distance * track_acc_rate)
    if gimbal_acc < 1:
        gimbal_acc = 1
    if gimbal_spd < 1:
        gimbal_spd = 1
    base.base_json_ctrl({"T":CMD_GIMBAL,"X":gimbal_x,"Y":gimbal_y,"SPD":gimbal_spd,"ACC":gimbal_acc})
    return distance


# Display function
# ================
def view(button):
    picam2 = Picamera2()
    picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))
    picam2.start()
    display_handle = display(None, display_id=True)

    color_upper = np.array([120, 255, 220])
    color_lower = np.array([ 90, 120, 90])
    min_radius = 12
    track_color_iterate = 0.023

    while True:
        frame = picam2.capture_array()
        # frame = cv2.flip(frame, 1) # if your camera reverses your image

        # uncomment this line if you are using USB camera
        # frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

        img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
        blurred = cv2.GaussianBlur(img, (11, 11), 0)
        hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, color_lower, color_upper)
        mask = cv2.erode(mask, None, iterations=5)
        mask = cv2.dilate(mask, None, iterations=5)

        cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,
            cv2.CHAIN_APPROX_SIMPLE)
        cnts = imutils.grab_contours(cnts)
        center = None

        height, width = img.shape[:2]
        center_x, center_y = width // 2, height // 2

        if len(cnts) > 0:
            # find the largest contour in the mask, then use
            # it to compute the minimum enclosing circle and
            # centroid
            c = max(cnts, key=cv2.contourArea)
            ((x, y), radius) = cv2.minEnclosingCircle(c)
            M = cv2.moments(c)
            center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))

            # only proceed if the radius meets a minimum size
            if radius > min_radius:
                distance = gimbal_track(center_x, center_y, center[0], center[1], track_color_iterate)
                # draw the detected ball outline on the frame to be displayed
                cv2.circle(frame, (int(x), int(y)), int(radius), (128, 255, 255), 1)

        _, frame = cv2.imencode('.jpeg', frame)
        display_handle.update(Image(data=frame.tobytes()))
        if stopButton.value == True:
            picam2.close()
            display_handle.update(None)
            break  # exit the loop to release the camera


# Run
# ================
display(stopButton)
thread = threading.Thread(target=view, args=(stopButton,))
thread.start()
</syntaxhighlight>
----
'''29 Web Command Line Application'''
To make the configuration of product parameters easier and allow users to add custom functionalities conveniently, we have designed command-line parameter functionality for the product. You can achieve corresponding functionalities by inputting commands into the command-line tool on the web page. This chapter will detail these functionalities.<br />
== Sending Information via ESP-NOW==<br />
<br />
=== Add Broadcast MAC Address to Peer===<br />
*send -a -b<br />
*send --add --broadcast<br />
*send -a FF:FF:FF:FF:FF:FF<br />
*send --add FF:FF:FF:FF:FF:FF<br />
<br />
=== Add Specific MAC Address to Peer===<br />
* send -a AA:BB:CC:DD:EE:FF<br />
*send --add AA:BB:CC:DD:EE:FF
<br />
===Remove Broadcast MAC Address from Peer===<br />
*send -rm -b<br />
*send --remove --broadcast
*send -rm FF:FF:FF:FF:FF:FF<br />
*send --remove FF:FF:FF:FF:FF:FF<br />
<br />
===Remove Specific MAC Address from Peer===<br />
* send -rm AA:BB:CC:DD:EE:FF<br />
*send --remove AA:BB:CC:DD:EE:FF<br />
<br />
=== Broadcast Sending (Broadcast MAC should be added to peer before first use)===<br />
*send -b what's up bro<br />
* send --broadcast what's up bro<br />
<br />
===Unicast Sending (Target MAC should be added to peer before first use)===<br />
*send AA:BB:CC:DD:EE:FF what's up bro<br />
<br />
===Multicast Sending (Target MAC should be added to peer before first use, multiple targets can be added, and cannot include broadcast MAC address: FF:FF:FF:FF:FF:FF)===<br />
*send -g what's up bro<br />
*send --group what's up bro<br />
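A typical session therefore looks like the following sequence (the specific MAC address is a placeholder):
<syntaxhighlight lang="bash">
send -a -b                         # add the broadcast address to the peer list
send -b hello everyone             # broadcast a message
send -a AA:BB:CC:DD:EE:FF          # add a specific device to the peer list
send AA:BB:CC:DD:EE:FF hello peer  # unicast a message to that device
</syntaxhighlight>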
<br />
==Audio Playback==<br />
===Text-to-Speech (TTS, initialization takes longer for the first use)===<br />
*audio -s what's up bro<br />
* audio --say what's up bro<br />
<br />
===Set Volume Level (Range between 0 and 1.0)===<br />
*audio -v 0.9<br />
*audio --volume 0.9<br />
<br />
===Play Audio File from the 'sounds' Directory (.mp3 .wav formats, can play files from other directories)===<br />
*audio -p file.mp3<br />
*audio --play_file file.mp3<br />
* audio -p others/file.mp3<br />
==Chassis==<br />
=== Send JSON Commands Directly to Chassis (refer to the related WIKI for the specific commands)===
* base -c {'T':1,'L':0,'R':0}<br />
*base --cmd {'T':1,'L':0,'R':0}<br />
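Note that the backend splits the command string on spaces (see the cmd_process() function in the Custom Command Line Functionality chapter), so the JSON payload must be written without spaces. A minimal Python sketch of the failure mode:
<syntaxhighlight lang="python">
# Without spaces the payload survives as a single argument:
args = "base -c {'T':1,'L':0,'R':0}".split()
print(args[2])  # {'T':1,'L':0,'R':0}

# With spaces the payload is split apart and can no longer be parsed:
args = "base -c {'T':1, 'L':0, 'R':0}".split()
print(args[2])  # {'T':1,
</syntaxhighlight>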
=== Enable Displaying Information from Chassis on Screen===<br />
* base -r on<br />
*base --recv on<br />
=== Disable Displaying Information from Chassis on Screen===<br />
*base -r off<br />
* base --recv off<br />
===Default Display Time for Command Line/ESP-NOW Information is 10 seconds===<br />
* info<br />
==OpenCV==<br />
===Set Target Color Range===<br />
*cv -r [90,120,90] [120,255,200]<br />
*cv --range [90,120,90] [120,255,200]
<br />
=== Select Color (Default colors are red, green, and blue only)===<br />
*cv -s red<br />
* cv --select red<br />
<br />
===Set Gimbal Tracking Parameters===<br />
====Color Tracking Iteration Ratio====<br />
*track -c 0.023<br />
====Face/Gesture Tracking Iteration Ratio====<br />
* track -f 0.068<br />
==== Tracking Speed====<br />
*track -s 60<br />
====Action Acceleration Ratio====<br />
*track -a 0.4
----
'''28 Custom Command Line Functionality'''
To facilitate the secondary development of the product, we have added a command-line input window in the WEB application. You can input commands in this window, and after clicking the SEND button, the command will be sent to the upper computer application. The upper computer application will execute corresponding functionalities or parameter adjustments based on the command you send.<br><br />
We already have some ready-made commands that you can refer to in the following sections of the WEB Command Line Application to learn about those commands. In this section, we will introduce how to implement custom command-line functionality while explaining how this feature is implemented, making it easier for you to understand the subsequent sections.<br />
<br />
== Adding Functionality==<br />
The example demo for command-line functionality is written in the main program robot_ctrl.py, and they are handled by the cmd_process() function. Below is our default command-line instruction processing function. This function is incomplete because the content afterward deals with other functionalities, which are omitted here without affecting the understanding of the function itself.<br><br />
Note: The code block below cannot be executed in JupyterLab and is only used for illustration purposes.<br />
<syntaxhighlight lang="python"><br />
def cmd_process(self, args_str):
    global show_recv_flag, show_info_flag, info_update_time, mission_flag
    global track_color_iterate, track_faces_iterate, track_spd_rate, track_acc_rate
    # Split the input parameter string into a list: args
    args = args_str.split()
    if args[0] == 'base':
        self.info_update("CMD:" + args_str, (0,255,255), 0.36)
        if args[1] == '-c' or args[1] == '--cmd':
            base.base_json_ctrl(json.loads(args[2]))
        elif args[1] == '-r' or args[1] == '--recv':
            if args[2] == 'on':
                show_recv_flag = True
            elif args[2] == 'off':
                show_recv_flag = False

    elif args[0] == 'info':
        info_update_time = time.time()
        show_info_flag = True

    elif args[0] == 'audio':
        self.info_update("CMD:" + args_str, (0,255,255), 0.36)
        if args[1] == '-s' or args[1] == '--say':
            audio_ctrl.play_speech_thread(' '.join(args[2:]))
        elif args[1] == '-v' or args[1] == '--volume':
            audio_ctrl.set_audio_volume(args[2])
        elif args[1] == '-p' or args[1] == '--play_file':
            audio_ctrl.play_file(args[2])
</syntaxhighlight><br />
Let's take '''audio -s hey hi hello''' as an example. This command is used for text-to-speech functionality, where audio represents an audio-related function, `-s` or `--say` indicates text-to-speech, and the following parameters are the content you want it to say. After sending this command, the robot will say "hey hi hello".<br><br />
When this function receives a command-line instruction, the instruction is a string, so we first use '''args = args_str.split()''' to convert it into a list. Then we can check each value in the list to execute the corresponding functionality. If you need to extend other custom functionalities, you just need to add another '''elif args[0] == 'newCmd'''' branch, as in the sketch below.
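For example, here is a minimal sketch of a hypothetical '''beep''' command (the command name, its options, and the beep.mp3 file are illustrative, not part of the shipped program); it would be added inside cmd_process() alongside the existing branches:
<syntaxhighlight lang="python">
    elif args[0] == 'beep':
        # hypothetical custom command: "beep -n 3" plays a prompt sound 3 times
        self.info_update("CMD:" + args_str, (0,255,255), 0.36)
        if len(args) > 2 and (args[1] == '-n' or args[1] == '--num'):
            for i in range(int(args[2])):
                audio_ctrl.play_file('beep.mp3')  # a sound file placed in the sounds directory
</syntaxhighlight>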
----
'''27 Crontab Automatic Startup Script'''
In previous tutorials, we briefly introduced how to disable the automatic startup of the product's main program. The method used is to comment out the command to run the product's main program in the Crontab file. In this chapter, you will learn more about Crontab and why we use Crontab instead of Services for automatic startup.<br><br />
Crontab is a tool in the Linux system used for scheduling periodic tasks. Through Crontab, users can set specific times, dates, or intervals to execute specific commands or scripts. Here are some important concepts and usages of Crontab:<br />
== Crontab File==<br />
The Crontab file is where scheduling information for periodic tasks is stored. Each user has their own Crontab file for storing their own scheduling information. Crontab files are usually stored in the /var/spool/cron directory, named after the user's username.<br />
==Crontab Format==<br />
Each line in the Crontab file represents a scheduled task and consists of five time fields (minute, hour, day of month, month, day of week) followed by the command to execute. You can use the # symbol to comment out a line to disable the corresponding task scheduling.
== Usage==<br />
To edit the Crontab file, you can use the crontab -e command. This command opens a text editor, allowing the user to edit their Crontab file. After editing, save and exit the editor, and the Crontab file will be updated.<br />
Common options:<br />
* -e: Edit the user's Crontab file.<br />
* -l: List the contents of the user's Crontab file.<br />
* -r: Remove the user's Crontab file.<br />
* -u: Specify the user to operate on.<br />
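For example, a minimal sketch of Crontab entries (the script paths are illustrative; @reboot is a shortcut that runs a command once at startup, as used by this product):
<syntaxhighlight lang="bash">
# minute hour day-of-month month day-of-week  command
30 2 * * *  /home/pi/backup.sh                  # run a backup script daily at 02:30
0  8 * * 1  /home/pi/report.sh >> ~/report.log  # every Monday at 08:00, log the output
@reboot     /home/pi/start.sh                   # run once at system startup
</syntaxhighlight>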
<br />
== Comparison with Services==<br />
<br />
In contrast, using services to achieve automatic startup at boot time is typically done by executing a series of predefined services or scripts when the system starts up. These services can be found in the /etc/init.d/ directory and started, stopped, or restarted using system service management tools like systemctl.<br />
<br />
== Advantages of Crontab:==<br />
:Flexibility: Tasks can be scheduled very flexibly, including specifying minutes, hours, dates, months, and days of the week.<br />
:Simplicity: Crontab configuration is relatively simple, making it convenient for scheduling simple periodic tasks.<br />
:User Independence: Each user has their own Crontab file, allowing them to manage their own tasks without affecting other users.<br />
<br />
== Advantages of Services:==<br />
<br />
* Reliability: Services implemented through services are often more stable and reliable as they are system-level services that are automatically loaded and run when the system starts.<br />
*Management: System administrators can more easily manage services, including starting, stopping, restarting, and checking status.<br />
* Control Permissions: For tasks requiring privileged execution, using services provides better control over permissions to ensure security.<br />
<br />
==Special Advantages of Crontab in this Product==<br />
Lower resource usage: through our testing and comparison, the CPU resource usage of the same Python script started via Crontab is about 1/4 of that started via services. For applications like complex robot main programs, using Crontab for automatic startup is therefore the better choice; services are more suitable for important services or applications that must be executed at system startup.
----
'''26 YAML Configuration File Settings'''
== What is YAML?==<br />
YAML (YAML Ain't Markup Language) is a human-readable data serialization format used to represent complex data structures. Its main purposes include configuration files, data exchange and storage, and passing data to programs.<br />
<br />
==Advantages of YAML==<br />
=== Human Readability===<br />
YAML uses indentation and a well-structured format, making files easy to read and understand. It is less verbose compared to XML or JSON and closer to natural language.<br />
===Ease of Writing and Editing===<br />
YAML syntax is clear and concise, requiring no additional markup symbols (such as XML tags or JSON braces), making it easier to write and edit.<br />
=== Support for Complex Data Structures===<br />
YAML supports nesting, lists, dictionaries, and other complex data structures, making it easy to represent various types of data.<br />
<br />
=== Extensibility===<br />
YAML allows the use of tags and anchors to represent relationships between objects, enabling data reuse and referencing, thereby enhancing extensibility.<br />
=== Language Independence===<br />
YAML is a universal data serialization format that is not tied to any specific programming language, making it easy to parse and generate using multiple programming languages.<br />
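The short sketch below illustrates these points (nesting, a list, and an anchor/alias pair); the keys are illustrative, not taken from this product's config.yaml:
<syntaxhighlight lang="yaml">
defaults: &motor_defaults   # anchor: a reusable node
  max_speed: 1.0
  reversed: false

motors:                     # a nested mapping containing a list
  - name: left
    <<: *motor_defaults     # alias: merge the anchored defaults
  - name: right
    <<: *motor_defaults
    reversed: true          # override one value
</syntaxhighlight>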
<br />
== Configuration File config.yaml for this Product==<br />
In the config.yaml of this product, we configure some key parameters related to the robot:<br />
<br />
===audio_config - Audio-related settings===<br />
*audio_output: Whether to use audio output<br />
*default_volume: Default volume level<br />
*min_time_between_play: Minimum interval between audio playback<br />
*speed_rate: TTS speech rate<br />
<br />
===base_config - Basic information===<br />
*robot_name: Product name<br />
*module_type: Module type (0 - No module, 1 - Robotic arm, 2 - Gimbal)<br />
*sbc_version: Host version<br />
<br />
===sbc_config - Upper computer settings===<br />
<br />
* feedback_interval: Interval for receiving feedback information<br />
*disabled_http_log: Disable HTTP server log information<br />
<br />
=== args_config - Robot parameter settings===<br />
* max_speed: Maximum speed<br />
* slow_speed: Low-speed speed<br />
* max_rate: Maximum speed rate<br />
* mid_rate: Medium speed rate<br />
* min_rate: Low-speed speed rate<br />
<br />
=== cv - OpenCV parameter settings===<br />
<br />
*default_color: Default target color for color recognition<br />
*color_lower: HSV LOWER value of target color<br />
*color_upper: HSV UPPER value of target color<br />
*min_radius: Radius threshold of target area<br />
*sampling_rad: Sampling area radius<br />
*track_color_iterate: Color tracking (pan-tilt) speed rate<br />
*track_faces_iterate: Face tracking (pan-tilt) speed rate<br />
*track_spd_rate: Pan-tilt rotation speed<br />
*track_acc_rate: Pan-tilt rotation acceleration<br />
*aimed_error: Aiming lock judgment threshold<br />
<br />
=== cmd_config - Instruction type codes===<br />
These codes are related to the definitions of instructions in the lower computer program and are used for communication debugging between the upper and lower computers. Changing these involves changing the lower computer program.<br />
<br />
===code - Function codes===<br />
These codes correspond to functions, and the frontend page also needs to load this .yaml file to obtain these configurations. This way, when the frontend web application communicates with the backend, different buttons correspond to different functions. Change only if necessary.<br />
=== fb - Feedback information codes===<br />
These codes correspond to types of feedback information. Some of this feedback is from the chassis to the host controller, and some is from the backend to the frontend. Using the same .yaml file for both the backend and frontend ensures uniformity of these codes and feedback information types. Change only if necessary.
----
'''25 Introduction to Main Program Architecture'''
== File Structure and Functionality==<br />
<br />
*ugv_pt_rpi<br />
**[Folder] AccessPopup (Used for network connection-related functions)<br />
**[Folder] sounds (Used to store audio files, where you can configure voice packages)<br />
**[Folder] static (Used to store captured photos)<br />
**[Folder] templates (Related resources for the web application)<br />
**[Folder] tutorial_cn (Chinese version of interactive tutorials)<br />
**[Folder] tutorial_en (English version of interactive tutorials)<br />
**[Folder] videos (Used to store recorded videos)<br />
**app.py (Main program of the product, including web-socket and Flask-related functionalities)<br />
**asound.conf (Sound card configuration file)<br />
**audio_ctrl.py (Library related to audio functionalities)<br />
**autorun.sh (Script to configure automatic startup of the main program and JupyterLab)<br />
**base_camera.py (Library for flask real-time video streaming with underlying multithreaded capture, original project is flask-video-streaming)<br />
**base_ctrl.py (Library for communication with the lower computer, communicates with the lower computer via serial port)<br />
**config.yaml (Configuration file used to configure some parameters)<br />
**requirements.txt (Python project dependencies)<br />
**robot_ctrl.py (Library for robot actions and visual processing)<br />
**serial_simple_ctrl.py (Standalone program used for testing serial communication)<br />
**setup.sh (Automatic installation script)<br />
**start_jupyter.sh (Script to start the JupyterLab server)
<br />
==Installation Script==<br />
In the project folder, there is a file named setup.sh, written in shell script, which helps automate the configuration of the upper computer for the robot product. This includes setting up the serial port, configuring the camera, creating a project virtual environment, and installing dependencies. These steps are already configured in the SD card image we provide.<br><br />
Usage of the installation script: The installation process involves downloading and installing many dependencies from the internet. For areas with special network environments, we recommend downloading the image file directly from our official website to install the product.<br />
==Automatic Program Execution==<br />
The autorun.sh script in the project folder is used to configure the automatic startup (as a user, not root) of the main program (app.py) and JupyterLab (start_jupyter.sh), and it generates the configuration file for JupyterLab.
<br />
== app.py Introduction (v0.89)==<br />
The following code block is for demonstration purposes only and cannot be executed. <br><br />
Import the Flask application and libraries related to JSON for building a web application.<br />
<br />
<syntaxhighlight lang="python"><br />
from importlib import import_module<br />
import os, socket, psutil<br />
import subprocess, re, netifaces<br />
from flask import Flask, render_template, Response, jsonify, request, send_from_directory, send_file<br />
from werkzeug.utils import secure_filename<br />
import json<br />
</syntaxhighlight><br />
Import libraries related to web-socket.<br />
<syntaxhighlight lang="python"><br />
from flask_socketio import SocketIO, emit<br />
</syntaxhighlight><br />
Import libraries related to robot control, including visual functions and action control.<br />
<syntaxhighlight lang="python"><br />
from robot_ctrl import Camera<br />
from robot_ctrl import RobotCtrlMiddleWare<br />
</syntaxhighlight><br />
Import other libraries.<br />
<syntaxhighlight lang="python"><br />
import time # Time function library<br />
import logging # Used to set output information for Flask application<br />
import threading # Threading function library<br />
<br />
import yaml # Used to read .yaml configuration files<br />
</syntaxhighlight><br />
Open the config.yaml configuration file and retrieve parameters from it.<br />
<syntaxhighlight lang="python"><br />
curpath = os.path.realpath(__file__)<br />
thisPath = os.path.dirname(curpath)<br />
with open(thisPath + '/config.yaml', 'r') as yaml_file:<br />
f = yaml.safe_load(yaml_file)<br />
<br />
robot_name = f['base_config']['robot_name']<br />
sbc_version = f['base_config']['sbc_version']<br />
</syntaxhighlight><br />
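For reference, the parts of config.yaml that app.py relies on have roughly the shape below. The key names are taken from the code on this page; the values shown are placeholders for illustration, not the real ones:<br />
<syntaxhighlight lang="yaml"><br />
base_config:<br />
  robot_name: UGV       # product name shown on the web page (placeholder value)<br />
  sbc_version: 0.0      # single-board computer version (placeholder value)<br />
code:                   # unique numeric code for every command, for example:<br />
  min_res: 0            # placeholder value<br />
  zoom_x1: 1            # placeholder value<br />
fb:                     # key names used in the feedback JSON, for example:<br />
  picture_size: 2       # placeholder value<br />
</syntaxhighlight><br />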
Instantiate the Flask application and turn off werkzeug's request logging (disable debug output).<br />
<syntaxhighlight lang="python"><br />
app = Flask(__name__)<br />
log = logging.getLogger('werkzeug')<br />
log.disabled = True<br />
</syntaxhighlight><br />
Instantiate WebSocket functionality (for communication between the web client and the server), video-related functionality (real-time video, OpenCV), and robot action control (movement, lighting, gimbal control, obtaining chassis feedback, etc.).<br />
<syntaxhighlight lang="python"><br />
socketio = SocketIO(app)<br />
camera = Camera()<br />
robot = RobotCtrlMiddleWare()<br />
</syntaxhighlight><br />
Network-related settings<br />
<syntaxhighlight lang="python"><br />
net_interface = "wlan0" # Wireless interface to monitor; the onboard interface is wlan0, USB adapters get other names<br />
wifi_mode = "None"<br />
# Store WIFI mode, this variable will be displayed on the OLED screen<br />
eth0_ip = None # IP address of the Ethernet port, will be displayed on the OLED screen<br />
wlan_ip = None # IP address of the wireless network (net_interface), will be displayed on the OLED screen<br />
</syntaxhighlight><br />
Storage path for audio files uploaded via drag-and-drop on the webpage.<br />
<syntaxhighlight lang="python"><br />
UPLOAD_FOLDER = thisPath + '/sounds/others'<br />
</syntaxhighlight><br />
Variables used to store status information about the upper computer (these variables are updated by other functions while the main program runs).<br />
<syntaxhighlight lang="python"><br />
pic_size = 0<br />
vid_size = 0<br />
cpu_read = 0<br />
cpu_temp = 0<br />
ram_read = 0<br />
rssi_read = 0<br />
</syntaxhighlight><br />
<br />
A dictionary mapping executable command codes to their corresponding functions. Each command has a unique code stored in the config.yaml file. A dictionary is used to select the command to execute because there are too many commands for a chain of if-else statements, which would severely hurt readability.<br />
<syntaxhighlight lang="python"><br />
cmd_actions = {<br />
f['code']['min_res']: lambda: camera.set_video_resolution("240P"),<br />
f['code']['mid_res']: lambda: camera.set_video_resolution("480P"),<br />
f['code']['max_res']: lambda: camera.set_video_resolution("960P"),<br />
f['code']['zoom_x1']: lambda: camera.scale_frame(1),<br />
f['code']['zoom_x2']: lambda: camera.scale_frame(2),<br />
f['code']['zoom_x4']: lambda: camera.scale_frame(4),<br />
f['code']['pic_cap']: lambda: camera.capture_frame(thisPath + '/static/'),<br />
f['code']['vid_sta']: lambda: camera.record_video(1, thisPath + '/videos/'),<br />
f['code']['vid_end']: lambda: camera.record_video(0, thisPath + '/videos/'),<br />
f['code']['cv_none']: lambda: camera.set_cv_mode(f['code']['cv_none']),<br />
f['code']['cv_moti']: lambda: camera.set_cv_mode(f['code']['cv_moti']),<br />
f['code']['cv_face']: lambda: camera.set_cv_mode(f['code']['cv_face']),<br />
f['code']['cv_objs']: lambda: camera.set_cv_mode(f['code']['cv_objs']),<br />
f['code']['cv_clor']: lambda: camera.set_cv_mode(f['code']['cv_clor']),<br />
f['code']['cv_hand']: lambda: camera.set_cv_mode(f['code']['cv_hand']),<br />
f['code']['cv_auto']: lambda: camera.set_cv_mode(f['code']['cv_auto']),<br />
f['code']['mp_face']: lambda: camera.set_cv_mode(f['code']['mp_face']),<br />
f['code']['mp_pose']: lambda: camera.set_cv_mode(f['code']['mp_pose']),<br />
f['code']['re_none']: lambda: camera.set_detection_reaction(f['code']['re_none']),<br />
f['code']['re_capt']: lambda: camera.set_detection_reaction(f['code']['re_capt']),<br />
f['code']['re_reco']: lambda: camera.set_detection_reaction(f['code']['re_reco']),<br />
f['code']['mc_lock']: lambda: camera.set_movtion_lock(f['code']['mc_lock']),<br />
f['code']['mc_unlo']: lambda: camera.set_movtion_lock(f['code']['mc_unlo']),<br />
f['code']['led_off']: robot.set_led_mode_off,<br />
f['code']['led_aut']: robot.set_led_mode_auto,<br />
f['code']['led_ton']: robot.set_led_mode_on,<br />
f['code']['base_of']: robot.set_base_led_off,<br />
f['code']['base_on']: robot.set_base_led_on,<br />
f['code']['head_ct']: robot.head_led_ctrl,<br />
f['code']['base_ct']: robot.base_led_ctrl,<br />
f['code']['s_panid']: camera.set_pan_id,<br />
f['code']['release']: camera.release_torque,<br />
f['code']['set_mid']: camera.middle_set,<br />
f['code']['s_tilid']: camera.set_tilt_id<br />
}<br />
</syntaxhighlight><br />
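A minimal sketch of how such a dictionary replaces an if-else chain (the run_command() helper and the cmd_code variable are hypothetical; in app.py the lookup happens inside the command-handling functions):<br />
<syntaxhighlight lang="python"><br />
def run_command(cmd_code):<br />
    action = cmd_actions.get(cmd_code)  # returns None for unknown codes<br />
    if action is not None:<br />
        action()  # every value in the dictionary is a zero-argument callable<br />
</syntaxhighlight><br />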
When the webpage loads, it also requests the config.yaml file from the server to configure parts of the interface, such as displaying the product name and keeping the command codes consistent with the server. The page sends the request to the "/config" route, and the server returns the contents of config.yaml.<br />
<syntaxhighlight lang="python"><br />
@app.route('/config')<br />
def get_config():<br />
with open(thisPath + '/config.yaml', 'r') as file:<br />
yaml_content = file.read()<br />
return yaml_content<br />
</syntaxhighlight><br />
<br />
Obtaining the WIFI signal strength, with the parameter being the name of the wireless network interface (a device may contain multiple wireless network interfaces).<br />
<syntaxhighlight lang="python"><br />
def get_signal_strength(interface):<br />
try:<br />
output = subprocess.check_output(["/sbin/iwconfig", interface]).decode("utf-8")<br />
signal_strength = re.search(r"Signal level=(-\d+)", output)<br />
if signal_strength:<br />
return int(signal_strength.group(1))<br />
return 0<br />
except FileNotFoundError:<br />
print("iwconfig command not found. Please ensure it's installed and in your PATH.")<br />
return -1<br />
except subprocess.CalledProcessError as e:<br />
print(f"Error executing iwconfig: {e}")<br />
return -1<br />
except Exception as e:<br />
print(f"An error occurred: {e}")<br />
return -1<br />
</syntaxhighlight><br />
<br />
Obtaining the WIFI mode, determining whether WIFI is in AP (Access Point) or STA (Station) mode.<br />
<syntaxhighlight lang="python"><br />
def get_wifi_mode():<br />
global wifi_mode<br />
try:<br />
result = subprocess.check_output(['/sbin/iwconfig', 'wlan0'], encoding='utf-8')<br />
<br />
if "Mode:Master" in result or "Mode:AP" in result:<br />
wifi_mode = "AP"<br />
return "AP"<br />
<br />
if "Mode:Managed" in result:<br />
wifi_mode = "STA"<br />
return "STA"<br />
<br />
except subprocess.CalledProcessError as e:<br />
print(f"Error checking Wi-Fi mode: {e}")<br />
return None<br />
<br />
return None<br />
</syntaxhighlight><br />
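Both functions above parse the text printed by iwconfig, which typically looks like the sample below (shown for illustration only; the exact fields vary by Wi-Fi driver). get_signal_strength() extracts the number after "Signal level=", and get_wifi_mode() looks for the "Mode:" field:<br />
<pre><br />
wlan0     IEEE 802.11  ESSID:"MyNetwork"<br />
          Mode:Managed  Frequency:2.437 GHz  Access Point: AA:BB:CC:DD:EE:FF<br />
          Link Quality=58/70  Signal level=-52 dBm<br />
</pre><br />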
<br />
Obtaining the IP address, with the required parameter being the network interface name.<br />
<syntaxhighlight lang="python"><br />
def get_ip_address(interface):<br />
try:<br />
interface_info = netifaces.ifaddresses(interface)<br />
<br />
ipv4_info = interface_info.get(netifaces.AF_INET, [{}])<br />
return ipv4_info[0].get('addr')<br />
except ValueError:<br />
print(f"Interface {interface} not found.")<br />
return None<br />
except IndexError:<br />
print(f"No IPv4 address assigned to {interface}.")<br />
return None<br />
</syntaxhighlight><br />
<br />
Obtaining CPU usage percentage. This function will block, with the blocking time being the interval parameter of cpu_percent().<br />
<syntaxhighlight lang="python"><br />
def get_cpu_usage():<br />
return psutil.cpu_percent(interval=2)<br />
</syntaxhighlight><br />
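If the two-second block is undesirable, psutil also offers a non-blocking variant that compares against the previous call instead of sleeping; a small sketch:<br />
<syntaxhighlight lang="python"><br />
def get_cpu_usage_nonblocking():<br />
    # interval=None returns immediately with the usage since the last call;<br />
    # the very first call returns 0.0 because no earlier sample exists yet<br />
    return psutil.cpu_percent(interval=None)<br />
</syntaxhighlight><br />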
Obtaining CPU temperature.<br />
<syntaxhighlight lang="python"><br />
def get_cpu_temperature():<br />
try:<br />
temperature_str = os.popen('vcgencmd measure_temp').readline()<br />
temperature = float(temperature_str.replace("temp=", "").replace("'C\n", ""))<br />
return temperature<br />
except Exception as e:<br />
print("Error reading CPU temperature:", str(e))<br />
return None<br />
</syntaxhighlight><br />
Obtaining the usage percentage of the running memory (RAM).<br />
<syntaxhighlight lang="python"><br />
def get_memory_usage():<br />
return psutil.virtual_memory().percent<br />
</syntaxhighlight><br />
<br />
This function integrates the above functions to obtain various information and assign them to corresponding variables, making it convenient for other parts of the program to access those variables.<br />
<syntaxhighlight lang="python"><br />
def update_device_info():<br />
global pic_size, vid_size, cpu_read, ram_read, rssi_read, cpu_temp<br />
cpu_read = get_cpu_usage()<br />
cpu_temp = get_cpu_temperature()<br />
ram_read = get_memory_usage()<br />
rssi_read= get_signal_strength(net_interface)<br />
</syntaxhighlight><br />
This function continuously collects the various pieces of feedback, merges them, and sends them to the web client. It is executed in a separate thread created with threading when the main program starts (when the first client establishes a connection), so it does not affect the execution of the main program. It retrieves chassis feedback and other vision-related information from camera.get_status(), and emits the merged JSON to the web client on the "/ctrl" namespace.<br><br />
The loop runs at 10Hz, but not every value is refreshed at 10Hz: that is only the rate at which camera.get_status() is polled. Other information, such as folder sizes and CPU/RAM usage, is refreshed much less often because collecting it consumes more resources.<br />
<syntaxhighlight lang="python"><br />
def update_data_websocket():<br />
while 1:<br />
try:<br />
fb_json = camera.get_status()<br />
except:<br />
continue<br />
socket_data = {<br />
f['fb']['picture_size']:pic_size,<br />
f['fb']['video_size']: vid_size,<br />
f['fb']['cpu_load']: cpu_read,<br />
f['fb']['cpu_temp']: cpu_temp,<br />
f['fb']['ram_usage']: ram_read,<br />
f['fb']['wifi_rssi']: rssi_read<br />
}<br />
try:<br />
socket_data.update(fb_json)<br />
socketio.emit('update', socket_data, namespace='/ctrl')<br />
except:<br />
pass<br />
time.sleep(0.1)<br />
</syntaxhighlight><br />
All the @app.route() decorators below are Flask route decorators. Routes distinguish the types of requests sent by the client, and different functions handle the different request types.<br><br />
This is the main route. When a client connects (when someone accesses the IP:5000 page), a random audio file from the sounds/connected folder is played, and the server returns the main control interface of the web application. The HTML file for the main control interface is index.html.<br />
<syntaxhighlight lang="python"><br />
@app.route('/')<br />
def index():<br />
"""Video streaming home page."""<br />
robot.play_random_audio("connected", False)<br />
return render_template('index.html')<br />
</syntaxhighlight><br />
<br />
Routes used to send various files to the client: CSS, JavaScript, photos, and videos.<br />
<syntaxhighlight lang="python"><br />
@app.route('/<path:filename>')<br />
def serve_static(filename):<br />
return send_from_directory('templates', filename)<br />
<br />
<br />
@app.route('/photo/<path:filename>')<br />
def serve_static_photo(filename):<br />
return send_from_directory('templates', filename)<br />
<br />
<br />
@app.route('/video/<path:filename>')<br />
def serve_static_video(filename):<br />
return send_from_directory('templates', filename)<br />
</syntaxhighlight><br />
Route used to open the settings page.<br />
<syntaxhighlight lang="python"><br />
@app.route('/settings/<path:filename>')<br />
def serve_static_settings(filename):<br />
return send_from_directory('templates', filename)<br />
</syntaxhighlight><br />
Route used to return from the settings page to the homepage.<br />
<syntaxhighlight lang="python"><br />
@app.route('/index')<br />
def serve_static_home():<br />
    # Note: redirect and url_for must also be imported from flask for this to work.<br />
    return redirect(url_for('index'))<br />
</syntaxhighlight><br />
Function used to obtain real-time video frames, sourced from the open-source project Flask Video Streaming.<br />
<syntaxhighlight lang="python"><br />
def gen(cameraInput):<br />
"""Video streaming generator function."""<br />
yield b'--frame\r\n'<br />
while True:<br />
frame = cameraInput.get_frame()<br />
yield b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n--frame\r\n'<br />
</syntaxhighlight><br />
Route used to display real-time video frames on the webpage.<br />
<syntaxhighlight lang="python"><br />
@app.route('/video_feed')<br />
def video_feed():<br />
"""Video streaming route. Put this in the src attribute of an img tag."""<br />
return Response(gen(camera),<br />
mimetype='multipart/x-mixed-replace; boundary=frame')<br />
</syntaxhighlight><br />
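As the docstring suggests, the stream is consumed by pointing an img element at this route, for example in index.html:<br />
<pre><br />
<img src="/video_feed"><br />
</pre><br />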
Route used to obtain a list of photo names in the photo folder.<br />
<syntaxhighlight lang="python"><br />
@app.route('/get_photo_names')<br />
def get_photo_names():<br />
photo_files = sorted(os.listdir(thisPath + '/static'), key=lambda x: os.path.getmtime(os.path.join(thisPath + '/static', x)), reverse=True)<br />
return jsonify(photo_files)<br />
</syntaxhighlight><br />
Route used to send images to the webpage.<br />
<syntaxhighlight lang="python"><br />
@app.route('/get_photo/<filename>')<br />
def get_photo(filename):<br />
return send_from_directory(thisPath + '/static', filename)<br />
</syntaxhighlight><br />
Route used to delete images.<br />
<syntaxhighlight lang="python"><br />
@app.route('/delete_photo', methods=['POST'])<br />
def delete_photo():<br />
filename = request.form.get('filename')<br />
try:<br />
os.remove(os.path.join(thisPath + '/static', filename))<br />
return jsonify(success=True)<br />
except Exception as e:<br />
print(e)<br />
return jsonify(success=False)<br />
</syntaxhighlight><br />
<br />
The following routes provide the same functionality for videos.<br />
<syntaxhighlight lang="python"><br />
@app.route('/delete_video', methods=['POST'])<br />
def delete_video():<br />
filename = request.form.get('filename')<br />
try:<br />
os.remove(os.path.join(thisPath + '/videos', filename))<br />
return jsonify(success=True)<br />
except Exception as e:<br />
print(e)<br />
return jsonify(success=False)<br />
<br />
<br />
@app.route('/get_video_names')<br />
def get_video_names():<br />
video_files = sorted(<br />
[filename for filename in os.listdir(thisPath + '/videos/') if filename.endswith('.mp4')],<br />
key=lambda filename: os.path.getctime(os.path.join(thisPath + '/videos/', filename)),<br />
reverse=True<br />
)<br />
return jsonify(video_files)<br />
<br />
<br />
@app.route('/videos/<path:filename>')<br />
def videos(filename):<br />
return send_from_directory(thisPath + '/videos', filename)<br />
</syntaxhighlight><br />
Obtaining a folder's size requires walking every file inside it, which consumes significant resources; this is why the main program only refreshes these values every 600 seconds (see oled_update() below).<br />
<br />
<syntaxhighlight lang="python"><br />
def get_folder_size(folder_path):<br />
total_size = 0<br />
for dirpath, dirnames, filenames in os.walk(folder_path):<br />
for filename in filenames:<br />
file_path = os.path.join(dirpath, filename)<br />
total_size += os.path.getsize(file_path)<br />
# Convert total_size to MB<br />
size_in_mb = total_size / (1024 * 1024)<br />
return round(size_in_mb,2)<br />
</syntaxhighlight><br />
WebSocket routes are used to receive JSON instructions from the client. Some instructions are high-frequency and require low latency, so WebSocket is used here instead of HTTP. WebSocket is connection-oriented: once a connection is established, many messages can be exchanged over it, whereas HTTP follows a request-response pattern whose per-request overhead makes it poorly suited to high-frequency, low-latency communication (HTTP's advantage is simplicity).<br />
<syntaxhighlight lang="python"><br />
@socketio.on('json', namespace='/json')<br />
def handle_socket_json(json):<br />
try:<br />
robot.json_command_handler(json)<br />
except Exception as e:<br />
print("Error handling JSON data:", e)<br />
return<br />
</syntaxhighlight><br />
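For illustration, commands can also be sent from a Python script; a minimal sketch using the python-socketio client package (the package, the IP placeholder, and the sample command are assumptions, not part of app.py):<br />
<syntaxhighlight lang="python"><br />
import socketio  # pip install "python-socketio[client]"<br />
<br />
sio = socketio.Client()<br />
sio.connect('http://192.168.10.50:5000', namespaces=['/json'])  # replace with your robot's IP<br />
# Send one JSON command to the handler above; the {"T": ...} format<br />
# matches the chassis commands shown in cmd_on_boot() later on this page.<br />
sio.emit('json', {"T": 13, "X": 0.2, "Z": 0}, namespace='/json')<br />
sio.disconnect()<br />
</syntaxhighlight><br />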
<br />
Function to update the information displayed on the OLED. It is executed in a separate thread created with threading while the main program runs. The function that feeds information to the webpage at 10Hz reads some variables that are updated here. The information shown on the OLED does not require high real-time performance, so this loop runs at a lower rate. (It does not call time.sleep() for the delay; instead, the blocking get_cpu_usage() call inside update_device_info() provides it.)<br />
<syntaxhighlight lang="python"><br />
def oled_update():<br />
    global eth0_ip, wlan_ip, pic_size, vid_size  # pic_size and vid_size are assigned below and read by update_data_websocket()<br />
robot.base_oled(0, f"E: No Ethernet")<br />
robot.base_oled(1, f"W: NO {net_interface}")<br />
robot.base_oled(2, "F/J:5000/8888")<br />
get_wifi_mode()<br />
start_time = time.time()<br />
last_folder_check_time = 0<br />
<br />
while True:<br />
current_time = time.time()<br />
<br />
if current_time - last_folder_check_time > 600:<br />
pic_size = get_folder_size(thisPath + '/static')<br />
vid_size = get_folder_size(thisPath + '/videos')<br />
last_folder_check_time = current_time<br />
<br />
update_device_info() # the interval of this loop is set in here<br />
get_wifi_mode()<br />
<br />
if get_ip_address('eth0') != eth0_ip:<br />
eth0_ip = get_ip_address('eth0')<br />
if eth0_ip:<br />
robot.base_oled(0, f"E:{eth0_ip}")<br />
else:<br />
robot.base_oled(0, f"E: No Ethernet")<br />
<br />
if get_ip_address(net_interface) != wlan_ip:<br />
wlan_ip = get_ip_address(net_interface)<br />
if wlan_ip:<br />
robot.base_oled(1, f"W:{wlan_ip}")<br />
else:<br />
robot.base_oled(1, f"W: NO {net_interface}")<br />
<br />
elapsed_time = current_time - start_time<br />
hours = int(elapsed_time // 3600)<br />
minutes = int((elapsed_time % 3600) // 60)<br />
seconds = int(elapsed_time % 60)<br />
robot.base_oled(3, f"{wifi_mode} {hours:02d}:{minutes:02d}:{seconds:02d} {rssi_read}dBm")<br />
</syntaxhighlight><br />
This route is used to handle command-line information sent by the client.<br />
<syntaxhighlight lang="python"><br />
@app.route('/send_command', methods=['POST'])<br />
def handle_command():<br />
command = request.form['command']<br />
print("Received command:", command)<br />
# camera.info_update("CMD:" + command, (0,255,255), 0.36)<br />
camera.cmd_process(command)<br />
return jsonify({"status": "success", "message": "Command received"})<br />
</syntaxhighlight><br />
<br />
Route used to obtain a list of audio files stored in the sounds/others folder.<br />
<syntaxhighlight lang="python"><br />
@app.route('/getAudioFiles', methods=['GET'])<br />
def get_audio_files():<br />
files = [f for f in os.listdir(UPLOAD_FOLDER) if os.path.isfile(os.path.join(UPLOAD_FOLDER, f))]<br />
return jsonify(files)<br />
</syntaxhighlight><br />
<br />
Route used to implement drag-and-drop upload functionality on the webpage. The uploaded audio files are saved in the sounds/others folder.<br />
<syntaxhighlight lang="python"><br />
@app.route('/uploadAudio', methods=['POST'])<br />
def upload_audio():<br />
if 'file' not in request.files:<br />
return jsonify({'error': 'No file part'})<br />
file = request.files['file']<br />
if file.filename == '':<br />
return jsonify({'error': 'No selected file'})<br />
if file:<br />
filename = secure_filename(file.filename)<br />
file.save(os.path.join(UPLOAD_FOLDER, filename))<br />
return jsonify({'success': 'File uploaded successfully'})<br />
</syntaxhighlight><br />
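Besides drag-and-drop, the same route can be exercised from a terminal; an illustrative curl call (the IP and file name are placeholders):<br />
<pre><br />
curl -F "file=@my_sound.mp3" http://[IP]:5000/uploadAudio<br />
</pre><br />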
Route used to play audio files stored in the sounds/others folder.<br />
<syntaxhighlight lang="python"><br />
@app.route('/playAudio', methods=['POST'])<br />
def play_audio():<br />
audio_file = request.form['audio_file']<br />
print(thisPath + '/sounds/others/' + audio_file)<br />
robot.audio_play(thisPath + '/sounds/others/' + audio_file)<br />
return jsonify({'success': 'Audio is playing'})<br />
</syntaxhighlight><br />
Route used to stop playback.<br />
<syntaxhighlight lang="python"><br />
@app.route('/stop_audio', methods=['POST'])<br />
def audio_stop():<br />
robot.audio_stop()<br />
return jsonify({'success': 'Audio stop'})<br />
</syntaxhighlight><br />
This function executes a set of command-line instructions automatically when the main program starts. You can freely add more instructions. The default instructions are explained below:<br />
*base -c {"T":142,"cmd":50}: Sets the feedback interval for the base. The default loop of the base has no delay. Adding a parameter of 50ms helps to improve the efficiency of the host controller (decoding the serial port information from the lower computer also consumes resources).<br />
*base -c {"T":131,"cmd":1}: Enables continuous feedback from the base, so that feedback information is automatically sent continuously instead of in a request-response manner.<br />
*base -c {"T":143,"cmd":0}: Disables echo, so the base does not echo the original information sent to it. This saves resources, especially when controlling the base with high-frequency instructions.<br />
*base -c {"T":4,"cmd":2}: Sets the type of peripheral for the base. 0 means no peripheral, 1 means a manipulator, and 2 means a gimbal.<br />
*base -c {"T":300,"mode":0,"mac":"EF:EF:EF:EF:EF:EF"}: Configures the base to not be controlled by ESP-NOW broadcast signals, but only by ESP-NOW instructions sent from the MAC address EF:EF:EF:EF:EF:EF. You can change this MAC address as needed.<br />
*send -a -b: Adds the broadcast address to the ESP-NOW peer list of the lower computer, facilitating inter-device communication functionalities.<br />
<syntaxhighlight lang="python"><br />
def cmd_on_boot():<br />
cmd_list = [<br />
'base -c {"T":142,"cmd":50}', # set feedback interval<br />
'base -c {"T":131,"cmd":1}', # serial feedback flow on<br />
'base -c {"T":143,"cmd":0}', # serial echo off<br />
'base -c {"T":4,"cmd":2}', # select the module - 0:None 1:RoArm-M2-S 2:Gimbal<br />
'base -c {"T":300,"mode":0,"mac":"EF:EF:EF:EF:EF:EF"}', # the base won't be ctrl by esp-now broadcast cmd, but it can still recv broadcast megs.<br />
'send -a -b' # add broadcast mac addr to peer<br />
]<br />
    for cmd in cmd_list:<br />
        camera.cmd_process(cmd)<br />
</syntaxhighlight><br />
When the main program is running, it will perform the following tasks:<br />
<syntaxhighlight lang="python"><br />
if __name__ == '__main__':<br />
# Randomly play an audio file from the sounds/robot_started folder<br />
robot.play_random_audio("robot_started", False)<br />
<br />
# Turn on the LED lights<br />
robot.set_led_mode_on()<br />
<br />
# Create a separate thread for update_data_websocket()<br />
    data_update_thread = threading.Thread(target=update_data_websocket, daemon=True)<br />
    data_update_thread.start()<br />
<br />
    # Create another thread for oled_update()<br />
oled_update_thread = threading.Thread(target=oled_update, daemon=True)<br />
oled_update_thread.start()<br />
<br />
# Get the size of the photo folder<br />
pic_size = get_folder_size(thisPath + '/static')<br />
<br />
# Get the size of the video folder<br />
vid_size = get_folder_size(thisPath + '/videos')<br />
<br />
# Turn off the LED lights<br />
robot.set_led_mode_off()<br />
<br />
# Execute commands on boot<br />
cmd_on_boot()<br />
<br />
# Start the Flask application server<br />
socketio.run(app, host='0.0.0.0', port=5000, allow_unsafe_werkzeug=True)<br />
<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/24_Simple_Web-Based_Application24 Simple Web-Based Application2024-03-22T02:09:33Z<p>Eng52: Created page with "<div class="wiki-pages jet-green-color"> In the previous chapters, we introduced how to use Flask to achieve low-latency image transmission, which is used to transfer the came..."</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In the previous chapters, we introduced how to use Flask to achieve low-latency image transmission, which is used to transfer the camera feed to the web application interface. Here, we will discuss how to pass the information entered on the web application interface to the backend of the web application. This functionality is used to control the robot using the web application.<br />
<syntaxhighlight lang="python"><br />
from flask import Flask, request<br />
<br />
app = Flask(__name__)<br />
<br />
@app.route('/', methods=['GET', 'POST'])<br />
def index():<br />
if request.method == 'POST':<br />
# Get form data<br />
form_data = request.form<br />
# Print form data<br />
print(form_data)<br />
# Return a simple response<br />
return 'Received data: {}'.format(form_data)<br />
else:<br />
# If it's a GET request, return a simple form page<br />
return '''<br />
<form method="post"><br />
<label for="input_data">Input data:</label><br><br />
<input type="text" id="input_data" name="input_data"><br><br />
<input type="submit" value="Submit"><br />
</form><br />
'''<br />
<br />
if __name__ == '__main__':<br />
app.run(host='0.0.0.0')<br />
</syntaxhighlight><br />
<br />
You can select the code block above and press Ctrl + Enter to run it. If you encounter a port conflict, it means you've previously run this code block. You'll need to click "Kernel" in the JupyterLab menu bar and then select "Shut Down All Kernels" to release the resources, including network port resources, occupied by the previously run code block. After that, you can rerun the code block to run this Flask application.<br />
<br />
Once you run the code block, you'll see messages like "Running on http://127.0.0.1:5000" and "Running on http://[IP]:5000". Usually, the "[IP]" here refers to the IP address assigned to your Raspberry Pi by your router. You can open a web browser on any device within the same local network and visit "[IP]:5000". Note that the ':' symbol here must be the English colon, representing the port number 5000 of the IP address you're accessing.<br />
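You can also test the POST handler without a browser; an illustrative curl call (the [IP] placeholder is your Raspberry Pi's address):<br />
<pre><br />
curl -X POST -d "input_data=hello" http://[IP]:5000/<br />
</pre><br />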
<br />
Upon visiting this page, you'll see an input box and a "Submit" button. You can enter some content into the input box and then click the "Submit" button. After that, you can see the content you entered on the web page, and the backend will display the content it received below the code block in JupyterLab.</div>Eng52https://www.waveshare.com/wiki/23_Pose_Detection_with_MediaPipe23 Pose Detection with MediaPipe2024-03-22T02:07:08Z<p>Eng52: /* Terminate the Main Program */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This section describes how to implement pose detection using MediaPipe + OpenCV.<br />
<br />
==What is MediaPipe?==<br />
MediaPipe is an open-source framework developed by Google for building machine learning-based multimedia processing applications. It provides a set of tools and libraries for processing video, audio, and image data, and applies machine learning models to achieve various functionalities such as pose estimation, gesture recognition, and face detection. MediaPipe is designed to offer efficient, flexible, and easy-to-use solutions, enabling developers to quickly build a variety of multimedia processing applications.<br />
<br />
== Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while the main program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual method sudo killall python typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer? ''' Enter '''Y ''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' at the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note that because the lower computer keeps communicating with the host over the serial port, the continuously changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays constantly on without blinking, turn the robot's power switch off and then on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. After waiting for the device to restart (during the restart process, the green LED of the Raspberry Pi will blink, and when the frequency of the green LED blinking decreases or goes out, it means that the startup is successful), refresh the page and continue with the remaining part of this tutorial.<br />
<br />
== Example==<br />
The following code block can be executed directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
=== If the real-time camera view is not visible during execution===<br />
*Click on Kernel -> Shut down all kernels above.<br />
*Close the current chapter tab and reopen it.<br />
*Click '''STOP''' to release the camera resources, then run the code block again.<br />
*Reboot the device.<br />
<br />
=== Features===<br />
When the code block runs normally, MediaPipe will automatically mark the joints of the human body when a person is in the frame.<br />
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
import imutils, math # Auxiliary libraries for image processing and mathematical operations<br />
from picamera2 import Picamera2 # Library to access the Raspberry Pi Camera<br />
from IPython.display import display, Image # Library to display images in Jupyter Notebook<br />
import ipywidgets as widgets # Library for creating interactive widgets, such as buttons<br />
import threading # Library for creating new threads for asynchronous execution of tasks<br />
import mediapipe as mp # Import the MediaPipe library for pose detection<br />
<br />
# Create a "Stop" button that users can click to stop the video stream<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
# Initialize MediaPipe's drawing tools and pose detection model<br />
mpDraw = mp.solutions.drawing_utils<br />
<br />
# MediaPipe Pose Detection<br />
mp_pose = mp.solutions.pose<br />
pose = mp_pose.Pose(static_image_mode=False, <br />
model_complexity=1, <br />
smooth_landmarks=True, <br />
min_detection_confidence=0.5, <br />
min_tracking_confidence=0.5)<br />
<br />
# Define the display function to process video frames and perform pose detection<br />
def view(button):<br />
# picam2 = Picamera2() # Create an instance of Picamera2<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start() # Start the camera<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True) # Create a display handle to update the displayed image<br />
<br />
while True:<br />
#frame = picam2.capture_array()<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
        img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Channel swap: camera.read() returns BGR, so this yields the RGB image MediaPipe expects<br />
<br />
results = pose.process(img) # Use MediaPipe to process the image and get pose detection results<br />
<br />
# If pose landmarks are detected<br />
if results.pose_landmarks:<br />
frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Convert the image from RGB to BGR for drawing<br />
mpDraw.draw_landmarks(frame, results.pose_landmarks, mp_pose.POSE_CONNECTIONS) # Use MediaPipe's drawing tools to draw pose landmarks and connections<br />
frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # Convert the image back from BGR to RGB for display<br />
<br />
_, frame = cv2.imencode('.jpeg', frame) # Encode the processed frame into JPEG format<br />
display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value==True: # Check if the "Stop" button is pressed<br />
            camera.release() # If so, release the camera (use picam2.close() when using Picamera2)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break # Exit the loop so the thread ends<br />
<br />
# Display the "Stop" button and start a thread to run the display function<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start() # Start the thread<br />
<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/22_Face_Recognition_Based_on_MediaPipe22 Face Recognition Based on MediaPipe2024-03-22T02:01:47Z<p>Eng52: /* Terminate the Main Program */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This section introduces how to implement face recognition using MediaPipe + OpenCV.<br />
<br />
== What is MediaPipe?==<br />
MediaPipe is an open-source framework developed by Google for building machine learning-based multimedia processing applications. It provides a set of tools and libraries for processing video, audio, and image data, and applies machine learning models to achieve various functionalities such as pose estimation, gesture recognition, and face detection. MediaPipe is designed to offer efficient, flexible, and easy-to-use solutions, enabling developers to quickly build a variety of multimedia processing applications.<br />
<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while the main program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual method sudo killall python typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter 1 and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note that because the lower computer keeps communicating with the host over the serial port, the continuously changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays constantly on without blinking, turn the robot's power switch off and then on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. After waiting for the device to restart (during the restart process, the green LED of the Raspberry Pi will blink, and when the frequency of the green LED blinking decreases or goes out, it means that the startup is successful), refresh the page and continue with the remaining part of this tutorial.<br />
<br />
== Example==<br />
<br />
The following code block can be run directly:<br />
<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
===If you cannot see the real-time camera feed when running:===<br />
<br />
*Click on Kernel -> Shut down all kernels above.<br />
*Close the current section tab and open it again.<br />
*Click '''STOP''' to release the camera resources, then run the code block again.<br />
*Reboot the device.<br />
<br />
===Features of this Section===<br />
<br />
When the code block runs correctly, MediaPipe will automatically detect faces in the video feed, outlining the position of the face and marking the facial features.<br />
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
import imutils, math # Libraries for auxiliary image processing and mathematical operations<br />
from picamera2 import Picamera2 # Library for accessing the Raspberry Pi Camera<br />
from IPython.display import display, Image # Library for displaying images in Jupyter Notebook<br />
import ipywidgets as widgets # Library for creating interactive interface widgets such as buttons<br />
import threading # Library for creating new threads for asynchronous task execution<br />
import mediapipe as mp # Import the MediaPipe library for face detection<br />
<br />
# Create a "STOP" button that users can click to stop the video stream<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
# Initialize the face detection model of MediaPipe<br />
mpDraw = mp.solutions.drawing_utils<br />
<br />
<br />
# MediaPipe face detection<br />
mp_face_detection = mp.solutions.face_detection<br />
face_detection = mp_face_detection.FaceDetection(model_selection=0, min_detection_confidence=0.5)<br />
<br />
# Define a display function to process video frames and perform face detection<br />
def view(button):<br />
# picam2 = Picamera2()<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))<br />
# picam2.start()<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True)<br />
<br />
while True:<br />
# frame = picam2.capture_array() # Capture a frame from the camera<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
        img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Channel swap: camera.read() returns BGR, so this yields the RGB image MediaPipe expects<br />
<br />
results = face_detection.process(img)<br />
<br />
# If a face is detected<br />
if results.detections:<br />
for detection in results.detections: # Iterate through each detected face<br />
frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)<br />
mpDraw.draw_detection(frame, detection) # Use MediaPipe's drawing tools to draw face landmarks<br />
frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
_, frame = cv2.imencode('.jpeg', frame) # Encode the processed frame to JPEG format<br />
display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value==True: # Check if the "STOP" button is pressed<br />
            camera.release() # If yes, release the camera (use picam2.close() when using Picamera2)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break # Exit the loop so the thread ends<br />
<br />
# Display the "STOP" button and start a thread to run the display function<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/PCIe_TO_M.2_(B)PCIe TO M.2 (B)2024-03-21T11:03:23Z<p>Eng20: /* Overview */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:PCIe TO M.2 (B).jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/pcie-to-m.2-b.htm}}]]<br />
|caption=<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
'''PCIe TO M.2 (B) is a PCIe to M.2 adapter card for adding an NVMe solid-state drive. It supports the Raspberry Pi Compute Module 4 (CM4); it does not support the Raspberry Pi 5.'''<br />
==Features==<br />
*Supports M.2 NVMe solid-state drives, enabling high-speed reading/writing and higher working efficiency.<br />
*Designed as a HAT for the Raspberry Pi Compute Module 4 (CM4) only.<br />
*Compatible with M.2 solid-state drives of different sizes.<br />
*The LED lights up at power-on and blinks during read/write activity.<br />
<br />
=User Guide=<br />
==Mounting Hard Drive to CM4==<br />
===Format===<br />
*Insert the SSD into the corresponding slot of the PCI-E to M.2 adapter, and secure it with the screws provided in the screw package.<br />
*After powering up and booting, execute "lspci" to check that the PCIe device is detected.<br />
[[file:PCIe TO M.2 (B)_1.png]]<br />
*Execute "sudo mkfs.ext4 /dev/nvme0n1p1" to format the device. ("Type "mkfs." and then press the "tab" key to see various suffixes representing different formats you can use for formatting.) <br />
Wait for a few moments, when "done" appears, it means that the formatting has been carried out. <br />
[[file:PCIe TO M.2 HAT+_W_4.png]]<br />
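For illustration, pressing Tab after typing "mkfs." typically lists variants like these (the exact list depends on the tools installed on your system):<br />
<pre><br />
mkfs.bfs   mkfs.ext2  mkfs.ext3  mkfs.ext4  mkfs.fat   mkfs.minix  mkfs.msdos  mkfs.vfat<br />
</pre><br />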
<br />
===Mount the Device===<br />
*Create a directory to mount:<br />
sudo mkdir toshiba<br />
*Mount the device:<br />
sudo mount /dev/nvme0n1p1 ./toshiba<br />
*Check the SSD status:<br />
df -h<br />
<br />
===Mount the Hard Drive===<br />
*Create a new directory as the mount point for the hard drive.<br />
<pre><br />
sudo mkdir /home/pi/toshiba<br />
</pre><br />
*Execute the following commands to mount the hard drive: <br />
<pre><br />
sudo mount /dev/nvme0n1p1 /home/pi/toshiba<br />
</pre><br />
Execute again:<br />
<pre><br />
df -h<br />
</pre><br />
Then you can see the hard drive we inserted and the related information, indicating that it has been mounted successfully.<br/><br />
[[File:pcie-m2-4.png|800px]]<br />
*Different hard drives have different device names; here it is "nvme0n1p1". Substitute the name of the drive you actually inserted.<br />
<br />
===Reading/Writing Test ===<br />
Enter the directory where the hard drive is mounted:<br />
<pre><br />
cd /home/pi/toshiba<br />
</pre><br />
*Drop the filesystem caches so the test is not skewed by cached data:<br />
<pre><br />
sudo sh -c "sync && echo 3 > /proc/sys/vm/drop_caches"<br />
</pre><br />
*Writing data from the Raspberry Pi's memory to the hard drive (write speed test):<br />
<pre><br />
sudo dd if=/dev/zero of=./test_write count=2000 bs=1024k<br />
</pre><br />
[[File:pcie-m2-5.png|800px]]<br />
*Reading data from the hard drive into the Raspberry Pi's memory (read speed test):<br />
<pre><br />
sudo dd if=./test_write of=/dev/null count=2000 bs=1024k<br />
</pre><br />
[[File:pcie-m2-6new.png|800px]]<br />
*Note: test results vary with different adapters and environments, and the Raspberry Pi is particularly sensitive to such variation.<br/><br />
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/21_Line_Following_Autonomous_Driving_with_OpenCV21 Line Following Autonomous Driving with OpenCV2024-03-21T10:52:32Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In this tutorial, we'll use the basic functionalities of OpenCV to detect yellow lines (default color) in the image and control the direction of the chassis based on the position of these lines. Please note that in this example, the chassis won't move. Instead, we'll only showcase the algorithms using OpenCV on the image. For safety reasons, we won't integrate motion control in this tutorial, as it's heavily influenced by external factors. Users should fully understand the code's functionality before adding corresponding motion control features.<br><br />
<br />
If you want to control the robot's movement through this example, please refer to the "Python Chassis Motion Control" section to add the relevant motion control functions (our open-source example is located in robot_ctrl.py).<br />
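As a rough sketch only: the commented-out call near the end of this page's code block suggests how the computed signals would be forwarded to the chassis. The BaseController class name and serial port below are assumptions borrowed from the chassis motion control chapter; verify them against base_ctrl.py before use.<br />
<syntaxhighlight lang="python"><br />
from base_ctrl import BaseController  # assumed class name; check base_ctrl.py<br />
<br />
# The serial port name is an assumption and depends on your host board.<br />
base = BaseController('/dev/serial0', 115200)<br />
<br />
# Inside the image-processing loop, after input_speed and input_turning<br />
# have been computed, forward them to the chassis:<br />
base.base_json_ctrl({"T": 13, "X": input_speed, "Z": input_turning})<br />
</syntaxhighlight><br />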
<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while the main program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual method sudo killall python typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1 ''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
<br />
:8. Add a # character at the beginning of the line with ……app.py >> …… to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer? ''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out ……start_jupyter.sh >>…… in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note that because the lower computer keeps communicating with the host over the serial port, the continuously changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays constantly on without blinking, turn the robot's power switch off and then on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. After waiting for the device to restart (during the restart process, the green LED of the Raspberry Pi will blink, and when the frequency of the green LED blinking decreases or goes out, it means that the startup is successful), refresh the page and continue with the remaining part of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br><br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
===If you cannot see the real-time camera feed when running:===<br />
*Click on Kernel -> Shut down all kernels above.<br />
*Close the current section tab and open it again.<br />
*Click '''STOP''' to release the camera resources, then run the code block again.<br />
*Reboot the device.<br />
<br />
=== Features of this Section===<br />
After running the following code block, place yellow tape in front of the camera and check whether the contours of the tape appear on the black mask image. The program samples the mask along two horizontal detection lines to locate the tape.<br />
<br />
<syntaxhighlight lang="python"><br />
import cv2 # Import OpenCV library for image process <br />
import imutils, math # Library to aid image processing and mathematical operations <br />
from picamera2 import Picamera2 # Library to access Raspberry Pi Camera <br />
import numpy as np<br />
from IPython.display import display, Image # Display images on Jupyter Notebook <br />
import ipywidgets as widgets # Widgets for creating interactive interfaces, such as buttons<br />
import threading # Used to create new threads for asynchronous execution of tasks<br />
<br />
# Stop button<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
<br />
# findline autodrive<br />
<br />
# Position of the upper sampling line as a fraction of the frame height<br />
# (0.6 = 60% down from the top); higher values move the line lower<br />
sampling_line_1 = 0.6<br />
<br />
# Position of the lower sampling line; must be greater than sampling_line_1 and less than 1<br />
sampling_line_2 = 0.9<br />
<br />
# Influence of the detected line slope on steering<br />
slope_impact = 1.5<br />
<br />
# Influence of the line position at the lower sampling line on steering<br />
base_impact = 0.005<br />
<br />
# Influence of speed on steering<br />
speed_impact = 0.5<br />
<br />
# Line-tracking speed<br />
line_track_speed = 0.3<br />
<br />
# Effect of the line slope on the tracking speed<br />
slope_on_speed = 0.1<br />
<br />
# Color of target line, HSV color space<br />
line_lower = np.array([25, 150, 70])<br />
line_upper = np.array([42, 255, 255])<br />
<br />
def view(button):<br />
# picam2 = Picamera2()<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))<br />
# picam2.start()<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True)<br />
<br />
while True:<br />
# img = picam2.capture_array()<br />
_, img = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
height, width = img.shape[:2]<br />
center_x, center_y = width // 2, height // 2<br />
# Image preprocessing includes color space conversion, Gaussian blur, and color range filtering, etc.<br />
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)<br />
<br />
line_mask = cv2.inRange(hsv, line_lower, line_upper) # Filter out target lines based on color ranges<br />
line_mask = cv2.erode(line_mask, None, iterations=1) # Eroding operation to remove noise<br />
line_mask = cv2.dilate(line_mask, None, iterations=1) # Expansion operation enhances the target line<br />
<br />
# Detect the target line based on the positions of the upper and lower sampling lines, and calculate steering and velocity control signals according to the detection results<br />
sampling_h1 = int(height * sampling_line_1)<br />
sampling_h2 = int(height * sampling_line_2)<br />
<br />
get_sampling_1 = line_mask[sampling_h1]<br />
get_sampling_2 = line_mask[sampling_h2]<br />
<br />
# Calculate the width of the target line at the upper and lower sampling lines<br />
sampling_width_1 = np.sum(get_sampling_1 == 255)<br />
sampling_width_2 = np.sum(get_sampling_2 == 255)<br />
<br />
if sampling_width_1:<br />
sam_1 = True<br />
else:<br />
sam_1 = False<br />
if sampling_width_2:<br />
sam_2 = True<br />
else:<br />
sam_2 = False<br />
<br />
# Get the edge index of the target line at the upper and lower sampling lines<br />
line_index_1 = np.where(get_sampling_1 == 255)<br />
line_index_2 = np.where(get_sampling_2 == 255)<br />
<br />
# If the target line is detected at the upper sampling line, calculate the center position of the target line<br />
if sam_1:<br />
sampling_1_left = line_index_1[0][0] # Index of the leftmost index of the upper sampling line target line<br />
sampling_1_right = line_index_1[0][sampling_width_1 - 1] # Index to the far right of the upper sampling line target line<br />
sampling_1_center= int((sampling_1_left + sampling_1_right) / 2) # Index of the center of the upper sampling line target line<br />
# If a target line is detected at the lower sampling line, calculate the target line center position<br />
if sam_2:<br />
sampling_2_left = line_index_2[0][0]<br />
sampling_2_right = line_index_2[0][sampling_width_2 - 1]<br />
sampling_2_center= int((sampling_2_left + sampling_2_right) / 2)<br />
<br />
# Initialize steering and speed control signals<br />
line_slope = 0<br />
input_speed = 0<br />
input_turning = 0<br />
<br />
# If the target line is detected at both sampling lines, calculate the slope of the line, and then calculate velocity and steering control signals based on the slope and the position of the target line.<br />
if sam_1 and sam_2:<br />
line_slope = (sampling_1_center - sampling_2_center) / abs(sampling_h1 - sampling_h2) # Calculate the slope of the line<br />
impact_by_slope = slope_on_speed * abs(line_slope) # Calculate the effect on velocity based on the slope<br />
input_speed = line_track_speed - impact_by_slope # Calculated speed control signal<br />
            input_turning = -(line_slope * slope_impact + (sampling_2_center - center_x) * base_impact) # + (speed_impact * input_speed) # Proportional steering: combines the line slope with the line's lateral offset from the image center<br />
elif not sam_1 and sam_2: # If the target line is detected only at the lower sampling line<br />
input_speed = 0 # Set speed to 0<br />
input_turning = (sampling_2_center - center_x) * base_impact # Calculating steering control signals<br />
elif sam_1 and not sam_2: # If the target line is detected only at the upper sample line<br />
input_speed = (line_track_speed / 3) # slow down<br />
input_turning = 0 # No steering<br />
else: # If neither sampling line detects the target line<br />
input_speed = - (line_track_speed / 3) # backward<br />
input_turning = 0 # No turning<br />
<br />
# base.base_json_ctrl({"T":13,"X":input_speed,"Z":input_turning})<br />
<br />
cv2.putText(line_mask, f'X: {input_speed:.2f}, Z: {input_turning:.2f}', (center_x+50, center_y+0), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)<br />
# Visualization operations include drawing lines at the positions of the sampling lines, marking the sampling results, and displaying steering and velocity control signals<br />
cv2.line(line_mask, (0, sampling_h1), (img.shape[1], sampling_h1), (255, 0, 0), 2)<br />
cv2.line(line_mask, (0, sampling_h2), (img.shape[1], sampling_h2), (255, 0, 0), 2)<br />
<br />
if sam_1:<br />
# Draw green marker lines at the ends of the target line at the upper sample line<br />
cv2.line(line_mask, (sampling_1_left, sampling_h1+20), (sampling_1_left, sampling_h1-20), (0, 255, 0), 2)<br />
cv2.line(line_mask, (sampling_1_right, sampling_h1+20), (sampling_1_right, sampling_h1-20), (0, 255, 0), 2)<br />
if sam_2:<br />
# Draw green marker lines at the ends of the target line at the lower sampling line<br />
cv2.line(line_mask, (sampling_2_left, sampling_h2+20), (sampling_2_left, sampling_h2-20), (0, 255, 0), 2)<br />
cv2.line(line_mask, (sampling_2_right, sampling_h2+20), (sampling_2_right, sampling_h2-20), (0, 255, 0), 2)<br />
if sam_1 and sam_2:<br />
# If the target line is detected at both the upper and lower sample lines, draw a red line from the center of the upper sample line to the center of the lower sample line.<br />
cv2.line(line_mask, (sampling_1_center, sampling_h1), (sampling_2_center, sampling_h2), (255, 0, 0), 2)<br />
<br />
_, frame = cv2.imencode('.jpeg', line_mask)<br />
display_handle.update(Image(data=frame.tobytes()))<br />
        if stopButton.value==True:<br />
            camera.release() # Release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None)<br />
            break<br />
<br />
<br />
# Display the "Stop" button and start the thread that displays the function<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/20_Gesture_Recognition_Based_on_MediaPipe20 Gesture Recognition Based on MediaPipe2024-03-21T10:44:50Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This section introduces how to implement gesture recognition using MediaPipe + OpenCV.<br />
==What is MediaPipe?==<br />
MediaPipe is an open-source framework developed by Google for building machine learning-based multimedia processing applications. It provides a set of tools and libraries for processing video, audio, and image data, and applies machine learning models to achieve various functionalities such as pose estimation, gesture recognition, and face detection. MediaPipe is designed to offer efficient, flexible, and easy-to-use solutions, enabling developers to quickly build a variety of multimedia processing applications.<br />
<br />
== Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
===Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter 1 and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter Y and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
=== If you cannot see the real-time camera feed when running:===<br />
*Click on Kernel -> Shut down all kernels above.<br />
*Close the current section tab and open it again.<br />
*Click '''STOP''' to release the camera resources, then run the code block again.<br />
*Reboot the device.<br />
=== Features of this Section===<br />
When the code block runs successfully, you can place your hand in front of the camera, and the real-time video frame will display annotations indicating the joints of the hand. These annotations will change with the movement of your hand, and the positions of each joint will be outputted as well, facilitating further development for gesture control.<br><br />
MediaPipe's hand-tracking model uses a different name for each joint, and you can retrieve a joint's position information through its corresponding index. Note that the indices are 0-based, as listed below (see the sketch after the list).<br />
====MediaPipe Hand Landmarks====<br />
0. WRIST<br />
1. THUMB_CMC<br />
2. THUMB_MCP<br />
3. THUMB_IP<br />
4. THUMB_TIP<br />
5. INDEX_FINGER_MCP<br />
6. INDEX_FINGER_PIP<br />
7. INDEX_FINGER_DIP<br />
8. INDEX_FINGER_TIP<br />
9. MIDDLE_FINGER_MCP<br />
10. MIDDLE_FINGER_PIP<br />
11. MIDDLE_FINGER_DIP<br />
12. MIDDLE_FINGER_TIP<br />
13. RING_FINGER_MCP<br />
14. RING_FINGER_PIP<br />
15. RING_FINGER_DIP<br />
16. RING_FINGER_TIP<br />
17. PINKY_MCP<br />
18. PINKY_PIP<br />
19. PINKY_DIP<br />
20. PINKY_TIP<br />
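As a minimal sketch (assuming the detection loop in the example below has already produced a `handLms` result and a `frame` image), a joint can be read either by its named constant or by its 0-based index:<br />
<syntaxhighlight lang="python"><br />
# Hypothetical snippet for use inside the detection loop, after hands.process() has returned handLms<br />
h, w = frame.shape[:2]<br />
index_tip = handLms.landmark[mpHands.HandLandmark.INDEX_FINGER_TIP] # the same joint as handLms.landmark[8]<br />
cx, cy = int(index_tip.x * w), int(index_tip.y * h) # landmark coordinates are normalized to [0, 1]<br />
print("index fingertip pixel position:", cx, cy)<br />
</syntaxhighlight><br />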
<syntaxhighlight lang="python"><br />
import cv2<br />
import imutils, math<br />
from picamera2 import Picamera2 # access Raspberry Pi Camera library<br />
from IPython.display import display, Image # Display images on Jupyter Notebook <br />
import ipywidgets as widgets # Widgets for creating interactive interfaces, such as buttons<br />
import threading #Used to create new threads for asynchronous execution of tasks<br />
import mediapipe as mp #Import MediaPipe library for hand critical point detection<br />
<br />
<br />
# Create a "Stop" button that the user can click to stop the video stream.<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
# Initialize MediaPipe drawing tool and hand critical point detection model <br />
mpDraw = mp.solutions.drawing_utils<br />
<br />
mpHands = mp.solutions.hands<br />
hands = mpHands.Hands(max_num_hands=1) # Initialize hand landmark detection model, up to one hand <br />
<br />
# Define display functions to process video frames and perform hand landmark detection<br />
def view(button):<br />
# picam2 = Picamera2() # Create Picamera2 example<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters <br />
# picam2.start() # Boot the camera <br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True) # Creates a display handle for updating the displayed image<br />
<br />
while True:<br />
# frame = picam2.capture_array()<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
        img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # MediaPipe expects RGB input, while OpenCV frames are BGR<br />
<br />
results = hands.process(img)<br />
<br />
# If the hand landmark is detected<br />
if results.multi_hand_landmarks:<br />
for handLms in results.multi_hand_landmarks: # Iterate over each hand detected<br />
# Drawing the hand landmark<br />
for id, lm in enumerate(handLms.landmark):<br />
h, w, c = img.shape<br />
cx, cy = int(lm.x * w), int(lm.y * h) # Calculate the position of the key point in the image<br />
cv2.circle(img, (cx, cy), 5, (255, 0, 0), -1) # Drawing dots at key point locations<br />
<br />
<br />
frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)<br />
mpDraw.draw_landmarks(frame, handLms, mpHands.HAND_CONNECTIONS) # Drawing hand skeleton connecting lines<br />
frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) <br />
<br />
                target_pos = handLms.landmark[mpHands.HandLandmark.INDEX_FINGER_TIP] # Example: read the index fingertip landmark (normalized coordinates)<br />
<br />
_, frame = cv2.imencode('.jpeg', frame)<br />
display_handle.update(Image(data=frame.tobytes()))<br />
        if stopButton.value==True:<br />
            camera.release() # Release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None)<br />
            break<br />
<br />
# Display the "Stop" button and start the thread that displays the function.<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/19_OpenCV_Color_Recognition19 OpenCV Color Recognition2024-03-21T10:41:24Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In this tutorial, we'll integrate some functions to modify frame images within OpenCV, such as blurring, color space conversion, erosion, and dilation.<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter 1 and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a # character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
=== If you cannot see the real-time camera feed when running:===<br />
*Click on Kernel -> Shut down all kernels above.<br />
*Close the current section tab and open it again.<br />
*Click '''STOP''' to release the camera resources, then run the code block again.<br />
*Reboot the device.<br />
=== Execution===<br />
By default, we detect blue balls in the example. Ensure that there are no blue objects in the background to avoid interfering with the color recognition function. You can also modify the detection color (in the HSV color space) through secondary development.<br />
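For a different target color, a minimal sketch (the BGR sample below is a hypothetical value, not taken from this program) shows one way to derive the HSV bounds:<br />
<syntaxhighlight lang="python"><br />
import cv2<br />
import numpy as np<br />
<br />
target_bgr = np.uint8([[[0, 80, 255]]]) # hypothetical orange-ish sample in BGR order<br />
target_hsv = cv2.cvtColor(target_bgr, cv2.COLOR_BGR2HSV)[0][0] # convert the single pixel to HSV<br />
h = int(target_hsv[0])<br />
color_lower = np.array([max(h - 10, 0), 120, 90]) # pad the hue by +/-10, keep wide S/V bounds<br />
color_upper = np.array([min(h + 10, 179), 255, 220])<br />
print(color_lower, color_upper)<br />
</syntaxhighlight><br />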
<syntaxhighlight lang="python"><br />
import cv2<br />
import imutils<br />
from picamera2 import Picamera2 # Library for accessing Raspberry Pi Camera<br />
import numpy as np # Library for mathematical calculations<br />
from IPython.display import display, Image # Library for displaying images in Jupyter Notebook<br />
import ipywidgets as widgets # Library for creating interactive widgets such as buttons<br />
import threading # Library for creating new threads to execute tasks asynchronously<br />
<br />
# Create a "Stop" button that users can click to stop the video stream<br />
# ================================================================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
# Define the display function to process video frames and recognize objects of specific colors<br />
def view(button):<br />
# picam2 = Picamera2() # Create an instance of Picamera2<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start() # Start the camera<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True) # Create a display handle to update displayed images<br />
i = 0<br />
<br />
# Define the color range to be detected<br />
color_upper = np.array([120, 255, 220])<br />
color_lower = np.array([90, 120, 90])<br />
min_radius = 12 # Define the minimum radius for detecting objects<br />
<br />
while True:<br />
# img = picam2.capture_array() # Capture a frame from the camera<br />
_, img = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
blurred = cv2.GaussianBlur(img, (11, 11), 0) # Apply Gaussian blur to the image to remove noise<br />
hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV) # Convert the image from BGR to HSV color space<br />
mask = cv2.inRange(hsv, color_lower, color_upper) # Create a mask to retain only objects within a specific color range<br />
mask = cv2.erode(mask, None, iterations=5) # Apply erosion to the mask to remove small white spots<br />
mask = cv2.dilate(mask, None, iterations=5) # Apply dilation to the mask to highlight the object regions<br />
<br />
# Find contours in the mask<br />
cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)<br />
cnts = imutils.grab_contours(cnts) # Extract contours<br />
center = None # Initialize the center of the object<br />
<br />
if len(cnts) > 0:<br />
# Find the largest contour in the mask, then use<br />
# it to compute the minimum enclosing circle and<br />
# centroid<br />
c = max(cnts, key=cv2.contourArea) # Find the largest contour<br />
((x, y), radius) = cv2.minEnclosingCircle(c) # Compute the minimum enclosing circle of the contour<br />
M = cv2.moments(c) # Compute the moments of the contour<br />
center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])) # Compute the center of the contour based on moments<br />
<br />
            if radius > min_radius: # If the enclosing circle is larger than the predefined minimum radius, draw it on the frame<br />
cv2.circle(img, (int(x), int(y)), int(radius), (128, 255, 255), 1) # Draw the minimum enclosing circle<br />
<br />
_, frame = cv2.imencode('.jpeg', img) # Encode the frame to JPEG format<br />
display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value==True: # Check if the "Stop" button has been pressed<br />
            camera.release() # If so, release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break<br />
<br />
<br />
# Display the "Stop" button and start a thread to execute the display function<br />
# ================================================================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
<br />
<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/18_Object_Recognition_Based_on_DNN_(Deep_Neural_Network)18 Object Recognition Based on DNN (Deep Neural Network)2024-03-21T10:38:24Z<p>Eng52: /* Terminate the Main Program */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This section introduces how to implement common object recognition using DNN (Deep Neural Network) + OpenCV.<br />
<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
<br />
===Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''# ''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y ''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
=== If you cannot see the real-time camera feed when running:===<br />
* Click on Kernel -> Shut down all kernels above.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
===Features of This Section===<br />
The `deploy.prototxt` file and `mobilenet_iter_73000.caffemodel` file are in the same path as this .ipynb file.<br><br />
When the code blocks run successfully, you can point the camera at common objects such as: "background", "aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car", "cat", "chair", "cow", "diningtable", "dog", "horse", "motorbike", "person", "pottedplant", "sheep", "sofa", "train", "tvmonitor".<br><br />
The program will annotate the objects it recognizes in the image and label them with their names.<br />
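Before running the full example, the model files and the output layout can be checked on a single image; this is a minimal sketch (`test.jpg` is a hypothetical file name) using the same preprocessing as the code block below:<br />
<syntaxhighlight lang="python"><br />
import cv2<br />
<br />
net = cv2.dnn.readNetFromCaffe('deploy.prototxt', 'mobilenet_iter_73000.caffemodel')<br />
img = cv2.imread('test.jpg') # hypothetical test image in the same folder<br />
blob = cv2.dnn.blobFromImage(cv2.resize(img, (300, 300)), 0.007843, (300, 300), 127.5)<br />
net.setInput(blob)<br />
detections = net.forward()<br />
# The result has shape (1, 1, N, 7); each row holds [image_id, class_id, confidence, x1, y1, x2, y2]<br />
print(detections.shape)<br />
</syntaxhighlight><br />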
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
from picamera2 import Picamera2 # Library for accessing Raspberry Pi Camera<br />
import numpy as np # Library for mathematical calculations<br />
from IPython.display import display, Image # Library for displaying images in Jupyter Notebook<br />
import ipywidgets as widgets # Library for creating interactive widgets like buttons<br />
import threading # Library for creating new threads for asynchronous task execution<br />
<br />
# Pre-defined class names based on the Caffe model<br />
class_names = ["background", "aeroplane", "bicycle", "bird", "boat",<br />
"bottle", "bus", "car", "cat", "chair", "cow", "diningtable",<br />
"dog", "horse", "motorbike", "person", "pottedplant", "sheep",<br />
"sofa", "train", "tvmonitor"]<br />
<br />
# Load the Caffe model<br />
net = cv2.dnn.readNetFromCaffe('deploy.prototxt', 'mobilenet_iter_73000.caffemodel')<br />
<br />
# Create a "Stop" button for users to stop the video stream by clicking it<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
<br />
# Define the display function to process video frames and perform object detection<br />
# ================<br />
def view(button):<br />
# picam2 = Picamera2() # Create an instance of Picamera2<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start() # Start the camera<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True) # Create a display handle for updating displayed images<br />
i = 0<br />
<br />
avg = None<br />
<br />
while True:<br />
# frame = picam2.capture_array() # Capture a frame from the camera<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# uncomment this line if you are using USB camera<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Convert the image from RGB to BGR because OpenCV uses BGR by default<br />
(h, w) = img.shape[:2] # Get the height and width of the image<br />
        # Generate the input blob: resize to 300x300, subtract the mean 127.5, and scale by 0.007843 (= 1/127.5) so pixels map to [-1, 1]<br />
        blob = cv2.dnn.blobFromImage(cv2.resize(img, (300, 300)), 0.007843, (300, 300), 127.5)<br />
net.setInput(blob) # Set the blob as the input to the network<br />
detections = net.forward() # Perform forward pass to get the detection results<br />
<br />
# Loop over the detected objects<br />
for i in range(0, detections.shape[2]):<br />
confidence = detections[0, 0, i, 2] # Get the confidence of the detected object<br />
if confidence > 0.2: # If the confidence is above the threshold, process the detected object<br />
idx = int(detections[0, 0, i, 1]) # Get the class index<br />
box = detections[0, 0, i, 3:7] * np.array([w, h, w, h]) # Get the bounding box of the object<br />
(startX, startY, endX, endY) = box.astype("int") # Convert the bounding box to integers<br />
<br />
# Annotate the object and confidence on the image<br />
label = "{}: {:.2f}%".format(class_names[idx], confidence * 100)<br />
cv2.rectangle(frame, (startX, startY), (endX, endY), (0, 255, 0), 2)<br />
y = startY - 15 if startY - 15 > 15 else startY + 15<br />
cv2.putText(frame, label, (startX, y), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)<br />
<br />
_, frame = cv2.imencode('.jpeg', frame) # Encode the frame as JPEG format<br />
display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value==True: # Check if the "Stop" button is pressed<br />
            camera.release() # If yes, release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break<br />
<br />
<br />
# Display the "Stop" button and start the display function's thread<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/17_Face_Recognition_Based_on_OpenCV17 Face Recognition Based on OpenCV2024-03-21T10:35:09Z<p>Eng52: /* Example */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This chapter introduces how to use OpenCV to compare feature databases and achieve face recognition. Although this method is not as efficient as MediaPipe's solution, it allows for the detection of other objects by replacing the feature database file.<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter Y and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
==Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
=== If you cannot see the real-time camera feed when running:===<br />
* Click on Kernel -> Shut down all kernels above.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
<br />
=== Features of This Chapter===<br />
The face feature database file is located in the same path as this .ipynb file. You can change the faceCascade variable to modify what needs to be detected. You'll need to replace the current `haarcascade_frontalface_default.xml` file with other feature files.<br><br />
When the code block runs successfully, you can position the robot's camera on a face, and the area containing the face will be automatically highlighted on the screen.<br />
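Switching the detection target only requires loading a different cascade file; here is a minimal sketch (assuming `haarcascade_eye.xml`, which ships with OpenCV, has been copied next to this notebook):<br />
<syntaxhighlight lang="python"><br />
import cv2<br />
<br />
eyeCascade = cv2.CascadeClassifier('haarcascade_eye.xml') # any Haar cascade file works the same way<br />
if eyeCascade.empty(): # empty() is True when the file could not be loaded<br />
    raise IOError('cascade file not found or invalid')<br />
</syntaxhighlight><br />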
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
from picamera2 import Picamera2 # Library for accessing the Raspberry Pi Camera<br />
import numpy as np # Library for mathematical calculations<br />
from IPython.display import display, Image # Library for displaying images in Jupyter Notebook<br />
import ipywidgets as widgets # Library for creating interactive widgets like buttons<br />
import threading # Library for creating new threads to execute tasks asynchronously<br />
<br />
# Load the Haar cascade classifier for face detection<br />
faceCascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')<br />
<br />
# Create a "Stop" button for users to stop the video stream by clicking on it<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
<br />
# Define a display function to process video frames and perform face detection<br />
# ================<br />
def view(button):<br />
# picam2 = Picamera2() # Create an instance of Picamera2<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start() # Start the camera<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle=display(None, display_id=True) # Create a display handle to update the displayed image<br />
i = 0<br />
<br />
avg = None<br />
<br />
while True:<br />
# frame = picam2.capture_array()<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Convert the image from RGB to BGR because OpenCV defaults to BGR<br />
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) # Convert the image to grayscale because face detection is typically performed on grayscale images<br />
<br />
# Perform face detection using the cascade classifier<br />
faces = faceCascade.detectMultiScale(<br />
gray, <br />
scaleFactor=1.2,<br />
minNeighbors=5, <br />
minSize=(20, 20)<br />
)<br />
<br />
if len(faces):<br />
for (x,y,w,h) in faces: # Loop through all detected faces<br />
cv2.rectangle(frame,(x,y),(x+w,y+h),(64,128,255),1) # Draw a rectangle around the detected face<br />
<br />
_, frame = cv2.imencode('.jpeg', frame) # Encode the frame as JPEG format<br />
display_handle.update(Image(data=frame.tobytes()))<br />
        if stopButton.value==True:<br />
            camera.release() # Release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None)<br />
            break<br />
<br />
<br />
# Display the "Stop" button and start a thread to execute the display function<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/16_Controlling_Photo_Capture_with_Buttons16 Controlling Photo Capture with Buttons2024-03-21T10:31:28Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This tutorial teaches how to control the camera for taking photos and recording videos by adding buttons to the page. Similar to previous tutorials, images are by default saved in the static folder, and videos are saved in the '''videos''' folder.<br />
== Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you "Save modified buffer?" Enter Y and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
=== If you cannot see the real-time camera feed when running:===<br />
* Click on Kernel -> Shut down all kernels above.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
<br />
===Notes===<br />
If you are using a USB camera, you need to uncomment the line '''frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)'''.<br />
=== Running===<br />
When the code block is executed, you can take a photo by clicking on PHOTO.<br />
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
from picamera2 import Picamera2 # Import the library to access the Raspberry Pi Camera<br />
import numpy as np # Import the library for mathematical calculations<br />
from IPython.display import display, Image # Import to display images in Jupyter Notebook<br />
import ipywidgets as widgets # Import to create interactive interface widgets like buttons<br />
import threading # Import to create new threads for asynchronous task execution<br />
<br />
import os, time # Import for file and directory operations and time-related functions<br />
<br />
time_interval = 3 # Set the time interval for taking photos (seconds)<br />
<br />
photo_path = '/home/ws/ugv_pt_rpi/static/' # Set the directory path to store photos and videos<br />
<br />
# Create a "Stop" button for users to stop video capture and photo taking<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # Set button style: 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # Set button icon (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
# Create a "Photo" button for users to instantly take a photo by clicking it<br />
# ================<br />
photoButton = widgets.ToggleButton(<br />
value=False,<br />
description='Photo',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # (FontAwesome names without the `fa-` prefix)<br />
)<br />
<br />
photo_num_count = 0 # Initialize the photo counter<br />
capture_lock = threading.Lock()<br />
last_photo_time = time.time() # Record the last time a photo was taken<br />
<br />
<br />
def photo_button_clicked(change, frame):<br />
global photo_num_count<br />
    if change['new'] and frame is not None: # When the "Photo" button is clicked and a frame is available<br />
photo_num_count += 1 # Increment the photo counter<br />
photo_filename = f'{photo_path}photo_{photo_num_count}.jpg' # Set the path and filename for saving the photo<br />
cv2.imwrite(photo_filename, frame) # Save the photo<br />
print(f'{photo_num_count} photos saved. New photo: {photo_filename}') # Print photo save information<br />
photoButton.value = False # Reset the status of the "Photo" button<br />
<br />
<br />
# Define a display function to capture and display video frames and respond to photo capture requests<br />
# ================<br />
def view(stop_button, photo_button):<br />
# picam2 = Picamera2() # Create an instance of Picamera2<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start()<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle = display(None, display_id=True) # Create a display handle to update the displayed image<br />
i = 0<br />
    latest_frame = {'frame': None} # Holds the most recent camera frame for the photo callback<br />
    # Register the "Photo" listener once, outside the loop, so each click saves a single photo<br />
    photoButton.observe(lambda change: photo_button_clicked(change, latest_frame['frame']), names='value')<br />
<br />
    while True:<br />
        # frame = picam2.capture_array() # Capture a frame from the camera<br />
        _, frame = camera.read()<br />
        # frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # uncomment this line if you are using a USB camera<br />
        latest_frame['frame'] = frame # Keep the newest frame available to the photo callback<br />
<br />
        _, jpeg = cv2.imencode('.jpeg', frame) # Encode the frame as JPEG format<br />
        display_handle.update(Image(data=jpeg.tobytes())) # Update the displayed image<br />
        if stopButton.value:<br />
            camera.release() # Release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None)<br />
            break<br />
<br />
<br />
# Display the "Stop" and "Photo" buttons and start a new thread to execute the display function<br />
# ================<br />
display(stopButton)<br />
display(photoButton)<br />
thread = threading.Thread(target=view, args=(stopButton, photoButton,))<br />
thread.start()<br />
</syntaxhighlight><br />
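To check the results programmatically, a minimal sketch (assuming the same `photo_path` as in the example above) lists the saved files:<br />
<syntaxhighlight lang="python"><br />
import os<br />
<br />
photo_path = '/home/ws/ugv_pt_rpi/static/' # same path as in the example above<br />
photos = sorted(f for f in os.listdir(photo_path) if f.startswith('photo_'))<br />
print(photos)<br />
</syntaxhighlight><br />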
Here's something to note: if a single click on the "Photo" button ever saves more than one photo, it is due to stability issues with the JupyterLab widgets used in this example, and the extra files can simply be deleted. You can navigate to ugv_pt_rpi/static/ in the left sidebar of JupyterLab to view the captured photos.</div>Eng52https://www.waveshare.com/wiki/15_OpenCV_Motion_Detection15 OpenCV Motion Detection2024-03-21T10:11:35Z<p>Eng52: /* Example */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This tutorial utilizes OpenCV to detect changes in the scene. You can set a threshold for how much change is detected, and adjusting this threshold allows you to modify the sensitivity of the motion detection.<br />
<br />
This chapter requires an understanding of the preceding chapters.<br />
<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a # character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you "Save modified buffer?" Enter Y and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. One thing to note is that since the sub-controller keeps communicating with the host over the serial port, the host may fail to start up properly during the restart because the serial port levels keep changing. If the host is a Raspberry Pi, wait until the Pi has shut down (the green LED stays on steadily, no longer blinking), then turn the robot's power switch off and on again, and the robot will restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (the green LED of the Raspberry Pi blinks during the restart; when the blinking slows down or stops, the startup has finished), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
<br />
===If you cannot see the real-time camera feed when running:===<br />
* Click on Kernel -> Shut down all kernels above.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
===Notes===<br />
If you are using a USB camera, you need to uncomment the line '''frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)'''.<br />
<br />
===Features of This Chapter===<br />
You can adjust the '''threshold''' parameter to change how sensitive OpenCV is to changes in the scene. It is the minimum contour area (in pixels) that is treated as motion: the lower the value, the more sensitive the detection.<br />
===Running===<br />
When you run the code block, you can see the real-time feed from the camera. You can wave your hand in front of the camera, and the program will automatically outline the areas of change with green boxes.<br />
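The core of the detection is a running-average background model; as a small standalone illustration (with made-up arrays instead of camera frames):<br />
<syntaxhighlight lang="python"><br />
import cv2<br />
import numpy as np<br />
<br />
avg = np.full((4, 4), 100, dtype="float") # running-average background model<br />
gray = np.full((4, 4), 130, dtype="uint8") # current frame, uniformly brighter<br />
cv2.accumulateWeighted(gray, avg, 0.5) # blend in place: avg = 0.5*gray + 0.5*avg<br />
delta = cv2.absdiff(gray, cv2.convertScaleAbs(avg)) # per-pixel difference from the background<br />
print(delta) # large values mark changed pixels<br />
</syntaxhighlight><br />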
<syntaxhighlight lang="python"><br />
<br />
import cv2<br />
from picamera2 import Picamera2<br />
import numpy as np<br />
from IPython.display import display, Image<br />
import ipywidgets as widgets<br />
import threading<br />
<br />
import imutils # Library for simplifying image processing tasks<br />
<br />
threshold = 2000 # Set the threshold for motion detection<br />
<br />
# Create a "Stop" button to control the process<br />
# ===================================================<br />
stopButton = widgets.ToggleButton(<br />
value=False,<br />
description='Stop',<br />
disabled=False,<br />
button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
tooltip='Description',<br />
icon='square' # Button icon (FontAwesome name without the `fa-` prefix)<br />
)<br />
<br />
<br />
# Display function definition, used to capture and process video frames, while performing motion detection<br />
# ===================================================<br />
def view(button):<br />
# picam2 = Picamera2() # Create a Picamera2 instance<br />
# picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)})) # Configure camera parameters<br />
# picam2.start() # Start the camera<br />
<br />
camera = cv2.VideoCapture(-1) <br />
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
display_handle = display(None, display_id=True)<br />
i = 0<br />
<br />
avg = None # Used to store the average frame<br />
<br />
while True:<br />
# frame = picam2.capture_array() # Capture a frame from the camera<br />
_, frame = camera.read()<br />
# frame = cv2.flip(frame, 1) # if your camera reverses your image<br />
<br />
# uncomment this line if you are using USB camera<br />
# frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) # Convert frame color from RGB to BGR<br />
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) # Convert the frame to grayscale<br />
gray = cv2.GaussianBlur(gray, (21, 21), 0) # Apply Gaussian blur to the grayscale image<br />
if avg is None: # If the average frame does not exist, create it<br />
avg = gray.copy().astype("float")<br />
continue<br />
<br />
try:<br />
cv2.accumulateWeighted(gray, avg, 0.5) # Update the average frame<br />
except:<br />
continue<br />
<br />
frameDelta = cv2.absdiff(gray, cv2.convertScaleAbs(avg)) # Calculate the difference between the current frame and the average frame<br />
<br />
# Apply a threshold to find contours in the difference image<br />
thresh = cv2.threshold(frameDelta, 5, 255, cv2.THRESH_BINARY)[1]<br />
thresh = cv2.dilate(thresh, None, iterations=2)<br />
cnts = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)<br />
cnts = imutils.grab_contours(cnts)<br />
# Iterate through contours<br />
for c in cnts:<br />
# Ignore contours that are too small<br />
if cv2.contourArea(c) < threshold:<br />
continue<br />
# Calculate the bounding box of the contour and draw a rectangle around it<br />
(mov_x, mov_y, mov_w, mov_h) = cv2.boundingRect(c)<br />
cv2.rectangle(frame, (mov_x, mov_y), (mov_x + mov_w, mov_y + mov_h), (128, 255, 0), 1) # Draw a rectangle around the moving area<br />
<br />
_, frame = cv2.imencode('.jpeg', frame) # Encode the processed frame in JPEG format<br />
display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value == True: # Check if the "Stop" button is pressed<br />
            camera.release() # If yes, release the camera resource (use picam2.close() when using Picamera2)<br />
            display_handle.update(None) # Clear the displayed image<br />
            break<br />
<br />
<br />
# Display the stop button and start the video stream display thread<br />
# ===================================================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/14_Time-lapse_Photography14 Time-lapse Photography2024-03-21T10:07:19Z<p>Eng52: /* Example */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This chapter builds upon the previous tutorial, capturing frames from the camera at regular intervals and saving them in the `static` folder within the `ugv_pt_rpi` directory.<br />
<br />
==Preparation==<br />
Since the product automatically runs the main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
It's worth noting that because the robot's main program uses multi-threading and is configured to run automatically at startup through crontab, the usual '''sudo killall python''' method typically doesn't work. Therefore, we'll introduce the method of disabling the automatic startup of the main program here.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note: because the sub-controller keeps communicating with the host over the serial port, the constantly changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays solid instead of blinking, turn the robot's power switch off and then on again; the robot will then restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''<br />
:13. Wait for the device to restart (during the restart the Raspberry Pi's green LED blinks; once the blinking slows down or stops, the startup is complete), then refresh the page and continue with the rest of this tutorial.<br />
<br />
==Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
===If you cannot see the real-time camera feed===<br />
* In the menu bar above, click Kernel -> Shut Down All Kernels.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
<br />
=== Notes===<br />
The code below is set up for a USB camera: the line '''frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)''' is left uncommented. If you use the Raspberry Pi (CSI) camera instead, comment out that line and uncomment the Picamera2 lines.<br />
<br />
=== Differences from the Previous Chapter===<br />
You can adjust the value of `time_interval` to change the interval between photos, measured in seconds.<br><br />
The photos you capture will be stored in the `/ugv_pt_rpi/static/` folder.<br />
<syntaxhighlight lang="python"><br />
import cv2 # Import the OpenCV library for image processing<br />
from picamera2 import Picamera2 # Import the Picamera2 library to access the Raspberry Pi Camera<br />
import numpy as np # Import the NumPy library<br />
from IPython.display import display, Image # Import IPython display functionality<br />
import ipywidgets as widgets # Import the ipywidgets library for creating interactive widgets<br />
import threading # Import the threading library for multithreading<br />
<br />
import os, time # Import the os and time libraries for file operations and time-related functionalities<br />
<br />
# Change the interval between photos here (in seconds)<br />
time_interval = 3 # Take a photo every 3 seconds<br />
<br />
# Set the path for saving the images<br />
# You can change the save path here<br />
photo_path = '/home/ws/ugv_pt_rpi/static/'<br />
<br />
# Create a toggle button as a stop button<br />
# ================<br />
stopButton = widgets.ToggleButton(<br />
    value=False, # The initial state of the button is unselected<br />
    description='Stop', # Text displayed on the button<br />
    disabled=False, # The button is initially enabled<br />
    button_style='danger', # 'success', 'info', 'warning', 'danger' or ''<br />
    tooltip='Description', # Tooltip displayed when hovering over the button<br />
    icon='square' # Button icon (FontAwesome name without the `fa-` prefix)<br />
)<br />
<br />
<br />
# Define a function for displaying the video stream and taking photos at regular intervals<br />
# ================<br />
def view(button):<br />
    last_picture_time = time.time() # Record the time of the last photo taken<br />
    num_count = 0 # Initialize the photo counter<br />
<br />
    # picam2 = Picamera2() # Create an instance of Picamera2<br />
    # Configure camera parameters, set video format and size<br />
    # picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))<br />
    # picam2.start() # Start the camera<br />
<br />
    camera = cv2.VideoCapture(-1) # Open the default USB camera<br />
    camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
    camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
    display_handle = display(None, display_id=True) # Create a display handle for updating the displayed content<br />
<br />
    while True:<br />
        # frame = picam2.capture_array()<br />
        _, frame = camera.read()<br />
        # frame = cv2.flip(frame, 1) # Flip the image<br />
<br />
        # Keep the following line when using a USB camera; comment it out for the CSI camera<br />
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)<br />
<br />
        # Take a photo every few seconds<br />
        if time.time() - last_picture_time >= time_interval:<br />
            num_count = num_count + 1 # Update the photo counter<br />
            photo_filename = f'{photo_path}photo_{num_count}.jpg' # Define the file name for the photo<br />
            cv2.imwrite(photo_filename, frame) # Save the photo to the specified path<br />
            last_picture_time = time.time() # Update the time of the last photo taken<br />
            print(f'{num_count} photos saved. new photo: {photo_filename}') # Print information about the saved photo<br />
<br />
        _, frame = cv2.imencode('.jpeg', frame) # Encode the frame as JPEG<br />
        display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value == True: # Check if the stop button is pressed<br />
            camera.release() # Release the camera resource (use picam2.close() for the CSI camera)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break # Exit the loop once the camera is released<br />
<br />
<br />
# Display the stop button and start the video stream display thread<br />
# ================<br />
display(stopButton)<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start()<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/13_Displaying_Real-Time_Video_Stream_in_Jupyter_Lab13 Displaying Real-Time Video Stream in Jupyter Lab2024-03-21T10:02:37Z<p>Eng52: /* Example */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In the previous chapter, we used Flask to display the real-time camera feed, a method that required opening a new tab in the browser or accessing it from another device. In this chapter, we'll explore a solution for viewing the real-time video stream directly in Jupyter Lab.<br />
== Preparation==<br />
Since the product automatically runs its main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
Note that because the robot's main program uses multi-threading and is started automatically at boot through crontab, the usual '''sudo killall python''' approach typically doesn't work. Therefore, we introduce here how to disable the automatic startup of the main program.<br />
=== Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note: because the sub-controller keeps communicating with the host over the serial port, the constantly changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays solid instead of blinking, turn the robot's power switch off and then on again; the robot will then restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (during the restart the Raspberry Pi's green LED blinks; once the blinking slows down or stops, the startup is complete), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Example==<br />
The following code block can be run directly:<br />
:1. Select the code block below.<br />
:2. Press Shift + Enter to run the code block.<br />
:3. Watch the real-time video window.<br />
:4. Press '''STOP''' to close the real-time video and release the camera resources.<br />
===If you cannot see the real-time camera feed===<br />
* In the menu bar above, click Kernel -> Shut Down All Kernels.<br />
* Close the current section tab and open it again.<br />
* Click '''STOP''' to release the camera resources, then run the code block again.<br />
* Reboot the device.<br />
<syntaxhighlight lang="python"><br />
import matplotlib.pyplot as plt # Import the matplotlib library for plotting (not used in this example)<br />
import cv2 # Import the OpenCV library for image processing<br />
from picamera2 import Picamera2 # Import the Picamera2 library for accessing the Raspberry Pi Camera<br />
import numpy as np # Import the NumPy library for mathematical computations<br />
from IPython.display import display, Image # Import IPython display functionality<br />
import ipywidgets as widgets # Import the ipywidgets library for creating interactive widgets<br />
import threading # Import the threading library for multithreading<br />
<br />
# Create a toggle button as a stop button<br />
stopButton = widgets.ToggleButton(<br />
    value=False, # The initial state of the button is unselected<br />
    description='Stop', # Text displayed on the button<br />
    disabled=False, # The button is initially enabled<br />
    button_style='danger', # The button style is red<br />
    tooltip='Description', # Tooltip displayed when hovering over the button<br />
    icon='square' # Icon displayed on the button<br />
)<br />
<br />
# Define a function for displaying the video stream<br />
def view(button):<br />
    # picam2 = Picamera2() # Create an instance of Picamera2<br />
    # Configure camera parameters, set video format and size<br />
    # picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))<br />
    # picam2.start() # Start the camera<br />
    camera = cv2.VideoCapture(-1) # Open the default USB camera<br />
    camera.set(cv2.CAP_PROP_FRAME_WIDTH, 640)<br />
    camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)<br />
<br />
    display_handle = display(None, display_id=True) # Create a display handle for updating the displayed content<br />
    while True:<br />
        # frame = picam2.capture_array() # Capture a frame from the camera<br />
        _, frame = camera.read()<br />
<br />
        # You can perform frame processing here if needed (e.g., flipping, color conversion, etc.)<br />
<br />
        _, frame = cv2.imencode('.jpeg', frame) # Encode the frame as JPEG<br />
        display_handle.update(Image(data=frame.tobytes())) # Update the displayed image<br />
        if stopButton.value == True: # Check if the stop button is pressed<br />
            camera.release() # Release the camera resource (use picam2.close() for the CSI camera)<br />
            display_handle.update(None) # Clear the displayed content<br />
            break # Exit the loop once the camera is released<br />
<br />
# Display the stop button<br />
display(stopButton)<br />
# Create and start a thread, with the target function as view and the stop button as the argument<br />
thread = threading.Thread(target=view, args=(stopButton,))<br />
thread.start() # Start the thread<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/12_Image_Transmission_Based_on_Flask12 Image Transmission Based on Flask2024-03-21T09:50:23Z<p>Eng52: /* Web */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This chapter introduces how to use Flask to create a web application for displaying real-time video from the robot's camera. Due to the cross-platform nature of web applications, users can watch the camera's real-time video on devices such as smartphones, PCs, tablets, etc., through a browser, achieving wireless image transmission functionality.<br />
<br />
==What is Flask?==<br />
Flask is a lightweight web application framework used to quickly build web applications using Python.<br />
* Lightweight: Flask is a lightweight framework with a relatively small core library, but it offers enough flexibility and extensibility for developers to choose and add the extensions and libraries they need.<br />
* Simple and Easy to Use: Flask is designed to be simple and easy to use. Its API is clear and well-documented, allowing developers to quickly get started and build web applications rapidly.<br />
* Routing System: Flask uses decorators to define URL routes, mapping requests to corresponding handler functions. This makes creating different pages and handling different requests intuitive and straightforward.<br />
* Template Engine: Flask integrates with the Jinja2 template engine, making it easier to build dynamic content within the application. The template engine allows you to embed dynamically generated content in HTML.<br />
* Integrated Development Server: Flask comes with a simple integrated development server for easy development and debugging. However, in production environments, it is recommended to use more powerful web servers such as Gunicorn or uWSGI.<br />
* Plugins and Extensions: Flask supports many plugins and extensions for adding additional functionality, such as database integration, authentication, form handling, etc.<br />
* RESTful Support: Flask provides good support for RESTful-style APIs, making it simple to build and design RESTful APIs.<br />
* WSGI Compatibility: Flask is based on the Web Server Gateway Interface (WSGI), allowing it to run on many web servers that comply with the WSGI standard.<br />
* Active Community: Flask has a large and active community, meaning you can easily find extensive documentation, tutorials, third-party extensions, and support.<br />
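As a minimal, generic illustration of the routing style described above (a standalone sketch, not part of this product's code):<br />
<syntaxhighlight lang="python"><br />
from flask import Flask # The Flask application class<br />
<br />
app = Flask(__name__) # Create the application instance<br />
<br />
@app.route('/') # The decorator maps the root URL to the handler below<br />
def hello():<br />
    return 'Hello from Flask!'<br />
<br />
if __name__ == '__main__':<br />
    # Same host/port style as the camera example later in this chapter<br />
    app.run(host='0.0.0.0', port=5000)<br />
</syntaxhighlight><br />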
<br />
==Preparation==<br />
Since the product automatically runs its main program at startup, which occupies the camera resource, this tutorial cannot be used while that program is running. You need to terminate the main program or disable its automatic startup, then restart the robot.<br><br />
Note that because the robot's main program uses multi-threading and is started automatically at boot through crontab, the usual '''sudo killall python''' approach typically doesn't work. Therefore, we introduce here how to disable the automatic startup of the main program.<br />
===Terminate the Main Program===<br />
:1. Click the "+" icon next to the tab for this page to open a new tab called "Launcher."<br />
:2. Click on "Terminal" under "Other" to open a terminal window.<br />
:3. Type '''bash''' into the terminal window and press Enter.<br />
:4. Now you can use the Bash Shell to control the robot.<br />
:5. Enter the command: '''crontab -e'''.<br />
:6. If prompted to choose an editor, enter '''1''' and press Enter to select nano.<br />
:7. After opening the crontab configuration file, you'll see the following two lines:<br />
<syntaxhighlight lang="python"><br />
@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:8. Add a '''#''' character at the beginning of the line with '''……app.py >> ……''' to comment out this line.<br />
<syntaxhighlight lang="python"><br />
#@reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
:9. Press Ctrl + X in the terminal window to exit. It will ask you '''Save modified buffer?''' Enter '''Y''' and press Enter to save the changes.<br />
:10. Reboot the device. Note that this process will temporarily close the current Jupyter Lab session. If you didn't comment out '''……start_jupyter.sh >>……''' in the previous step, you can still use Jupyter Lab normally after the robot reboots (JupyterLab and the robot's main program app.py run independently). You may need to refresh the page.<br />
:11. Note: because the sub-controller keeps communicating with the host over the serial port, the constantly changing serial levels may prevent the host from booting properly during the restart. If the host is a Raspberry Pi and, after shutdown, its green LED stays solid instead of blinking, turn the robot's power switch off and then on again; the robot will then restart normally.<br />
:12. Enter the reboot command: '''sudo reboot'''.<br />
:13. Wait for the device to restart (during the restart the Raspberry Pi's green LED blinks; once the blinking slows down or stops, the startup is complete), then refresh the page and continue with the rest of this tutorial.<br />
<br />
== Web Application Example==<br />
===Note: The following code block cannot be run in Jupyter Lab===<br />
Due to potential port conflicts between the Flask application and Jupyter Lab, the following code cannot be run in Jupyter Lab. The code is stored in the `12` folder within the `tutorial_cn` and `tutorial_en` directories; inside the `12` folder there is also a folder named `template` for storing web resources. Follow these steps to run the example:<br><br />
:1. Open a terminal using the method described earlier. Make sure the terminal's working directory matches the file path shown in the file browser on the left, then navigate to the example folder by entering `cd 12` in the terminal.<br />
:2. Start the Flask web application server with the command: '''python flask_camera.py'''.<br />
:3. Open a web browser on a device within the same local network (or open a new tab in the browser on the same device) and enter the Raspberry Pi's IP address followed by `:5000`. For example, if the Raspberry Pi's IP address is `192.168.10.104`, enter `192.168.10.104:5000` in the browser's address bar. Note that the colon must be an English (half-width) colon.<br />
:4. Use Ctrl + C in the terminal to end the running application.<br />
<br />
===Flask Example===<br />
<syntaxhighlight lang="python"><br />
from flask import Flask, render_template, Response # Flask class, render_template for rendering HTML templates, Response for generating response objects<br />
from picamera2 import Picamera2 # Picamera2 class for accessing and controlling the camera<br />
import time # time module for handling time-related tasks<br />
import cv2 # OpenCV library for image processing<br />
<br />
app = Flask(__name__) # Create a Flask application instance<br />
<br />
def gen_frames(): # Generator function that yields frames captured by the camera<br />
    picam2 = Picamera2() # Create an instance of Picamera2<br />
<br />
    # Configure camera parameters, set video format and size<br />
    picam2.configure(picam2.create_video_configuration(main={"format": 'XRGB8888', "size": (640, 480)}))<br />
<br />
    picam2.start() # Start the camera<br />
    while True:<br />
        frame = picam2.capture_array() # Capture a frame from the camera<br />
<br />
        ret, buffer = cv2.imencode('.jpg', frame) # Encode the captured frame as JPEG<br />
<br />
        frame = buffer.tobytes() # Convert the JPEG image to a byte stream<br />
<br />
        # Yield the image byte stream so that video frames are transmitted continuously, forming a video stream<br />
        yield (b'--frame\r\n'<br />
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n')<br />
<br />
@app.route('/') # Define the root route<br />
def index():<br />
    return render_template('index.html') # Return the index.html page<br />
<br />
@app.route('/video_feed') # Define the video stream route<br />
def video_feed():<br />
    # Return a response object with the video stream content, with content type multipart/x-mixed-replace<br />
    return Response(gen_frames(), mimetype='multipart/x-mixed-replace; boundary=frame')<br />
<br />
if __name__ == '__main__':<br />
    app.run(host='0.0.0.0', port=5000, debug=True) # Start the Flask app, listening on port 5000 on all network interfaces, with debug mode enabled<br />
<br />
</syntaxhighlight><br />
<br />
===Explanation of Key Parts of the Code===<br />
'''gen_frames()''': This is a generator function that continuously captures frames from the camera, encodes them into JPEG format, and yields the frame bytes as part of a multipart response. The generated frames are streamed in real-time to the client.<br><br />
<br />
'''@app.route('/')''': This decorator associates the index() function with the root URL (/). When a user accesses the root URL, it renders the HTML template named 'index.html' (matching the render_template call above).<br><br />
<br />
'''@app.route('/video_feed')''': This decorator associates the video_feed() function with the '/video_feed' URL. This route is used for real-time video streaming, where frames are sent as multipart responses.<br><br />
<br />
'''app.run(host='0.0.0.0', port=5000, debug=True)''': This line starts the Flask development server, listening on all available network interfaces (0.0.0.0) on port 5000. The debug=True option enables the server's debug mode.<br />
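Because the stream served at /video_feed is plain MJPEG over HTTP, it can also be consumed outside the browser. Below is a minimal client-side sketch; it assumes OpenCV (with its FFmpeg backend) is installed on the viewing device, and the IP address 192.168.10.104 is only an example:<br />
<syntaxhighlight lang="python"><br />
import cv2 # OpenCV on the client device<br />
<br />
# VideoCapture can open an MJPEG HTTP stream directly (FFmpeg backend assumed)<br />
stream = cv2.VideoCapture('http://192.168.10.104:5000/video_feed')<br />
<br />
while True:<br />
    ret, frame = stream.read()<br />
    if not ret: # Stop if the stream cannot be read<br />
        break<br />
    cv2.imshow('UGV camera', frame)<br />
    if cv2.waitKey(1) & 0xFF == ord('q'): # Press q to quit<br />
        break<br />
<br />
stream.release()<br />
cv2.destroyAllWindows()<br />
</syntaxhighlight><br />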
<br />
===Web===<br />
Reconstructed from the original annotations, the `index.html` template stored in the web resources folder looks like this:<br />
<syntaxhighlight lang="html"><br />
<!doctype html> <!-- HTML document type declaration --><br />
<html lang="en"> <!-- Root element of the HTML document, specifying the page language as English --><br />
  <head> <!-- Contains metadata of the document, such as the character set and page title --><br />
    <!-- Required meta tags --><br />
    <meta charset="utf-8"> <!-- The document uses the UTF-8 character set --><br />
    <title>Live Video Based on Flask</title> <!-- Sets the page title --><br />
  </head><br />
  <body> <!-- Contains the visible content of the document --><br />
    <!-- The image tag below is dynamically updated with the video feed from Flask --><br />
    <img src="{{ url_for('video_feed') }}"> <!-- Uses the video_feed route defined in Flask to fetch the live video stream --><br />
  </body><br />
</html><br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/11_Text-to-Speech_(TTS)11 Text-to-Speech (TTS)2024-03-21T09:45:27Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
For security reasons, JupyterLab cannot directly access audio devices (a limitation of the environment), so the code blocks provided here are not intended for execution by users.<br><br />
The programs presented here originate from the product's main program file, audio_ctrl.py. You can refer to these code snippets to gain insight into how the product's main program facilitates text-to-speech functionality.<br />
<syntaxhighlight lang="python"><br />
import pyttsx3 # Import the pyttsx3 library for text-to-speech functionality<br />
import threading # Import the threading module for creating threads<br />
<br />
# Initialize the pyttsx3 engine<br />
engine = pyttsx3.init()<br />
<br />
# Create an event object to control the synchronization of audio playback<br />
play_audio_event = threading.Event()<br />
<br />
# Set the speed of voice playback<br />
engine.setProperty('rate', 180)<br />
<br />
# Define a function to speak the given text<br />
def play_speech(input_text):<br />
    engine.say(input_text) # Feed the text into the engine<br />
    engine.runAndWait() # Wait for the voice output to complete<br />
    play_audio_event.clear() # Clear the event to indicate that voice playback is complete<br />
<br />
def play_speech_thread(input_text):<br />
    if play_audio_event.is_set(): # If a voice is already being played, return immediately to avoid overlapping playback<br />
        return<br />
    play_audio_event.set() # Set the event to indicate that a new voice playback task has started<br />
    # Create a new thread to play the voice using the play_speech function<br />
    speech_thread = threading.Thread(target=play_speech, args=(input_text,))<br />
    speech_thread.start() # Start the new thread to begin voice playback<br />
<br />
</syntaxhighlight><br />
<br />
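For context, a typical call into these helpers looks like the following usage sketch (as noted above, it will not produce sound from inside JupyterLab):<br />
<syntaxhighlight lang="python"><br />
# Speak asynchronously without blocking the caller;<br />
# a second call is ignored while play_audio_event is still set<br />
play_speech_thread("Target detected")<br />
</syntaxhighlight><br />
<br />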
The code uses the pyttsx3 library for text-to-speech conversion and the threading module to play voice asynchronously. The play_speech() function speaks the given text in the calling thread, while the play_speech_thread() function performs playback in a new thread to avoid blocking the main thread.<br><br />
Also, "play_audio_event" controls the synchronization of voice playback to ensure that only one voice is playing at a time.</div>Eng52https://www.waveshare.com/wiki/10_Play_Audio_Files10 Play Audio Files2024-03-21T09:43:16Z<p>Eng52: Created page with "<div class="wiki-pages jet-green-color"> Due to security reasons, you cannot directly access the audio device through JupyterLab (environment limitations). The code blocks pro..."</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
Due to security reasons, you cannot directly access the audio device through JupyterLab (environment limitations). The code blocks provided here are not meant for user execution.<br><br />
<br />
The code presented here is extracted from the product's main program, audio_ctrl.py. You can refer to this code to understand how the main program implements the functionality of playing audio files.<br />
<br />
==Audio-related Functionality in the Product Main Program==<br />
Inside the folder of the product's main program, there is a folder named "sounds", which contains many subfolders: connected, others, recv_new_cmd, robot_started, searching_for_target, target_detected, target_locked.<br><br />
<br />
In the default program provided, only one audio file is placed in each of the connected and robot_started subfolders.<br><br />
<br />
Once the robot's main program starts running, it will automatically play a random audio file from the robot_started folder.<br><br />
<br />
When a client connects to this web application using a browser, a random audio file from the connected folder will be played automatically.<br><br />
<br />
You can place custom audio files in these folders as voice packs to customize your product further.<br />
<syntaxhighlight lang="python"><br />
import pygame # Import the pygame library for audio playback<br />
import random # Import the random library for random selection of audio files<br />
import yaml # Import the yaml library for reading configuration files<br />
import os # Import the os library for file operations<br />
import time # Import the time library, used for the delay between plays<br />
import threading # Import the threading library for multithreading<br />
<br />
# Get the configuration file<br />
curpath = os.path.realpath(__file__)<br />
thisPath = os.path.dirname(curpath)<br />
with open(thisPath + '/config.yaml', 'r') as yaml_file:<br />
    config = yaml.safe_load(yaml_file)<br />
<br />
# Initialize pygame.mixer and set the default volume for audio output<br />
pygame.mixer.init()<br />
pygame.mixer.music.set_volume(config['audio_config']['default_volume'])<br />
<br />
# Create an event object for controlling audio playback<br />
play_audio_event = threading.Event()<br />
<br />
# Get the minimum time between plays from the configuration file (key spelling follows config.yaml)<br />
min_time_bewteen_play = config['audio_config']['min_time_bewteen_play']<br />
<br />
# Define a function to play an audio file<br />
def play_audio(input_audio_file):<br />
    try:<br />
        pygame.mixer.music.load(input_audio_file) # Load the audio file<br />
        pygame.mixer.music.play() # Play the audio<br />
    except Exception:<br />
        play_audio_event.clear() # Clear the event in case of an error<br />
        return<br />
    while pygame.mixer.music.get_busy(): # Wait for the audio to finish playing<br />
        pass<br />
    time.sleep(min_time_bewteen_play) # Wait for the minimum time between plays<br />
    play_audio_event.clear() # Clear the event<br />
<br />
# Define a function to play a random audio file from a subfolder of sounds/<br />
def play_random_audio(input_dirname, force_flag):<br />
    if play_audio_event.is_set() and not force_flag:<br />
        return<br />
    # Get all audio files in the specified directory<br />
    audio_files = [f for f in os.listdir(thisPath + "/sounds/" + input_dirname) if f.endswith((".mp3", ".wav"))]<br />
    # Choose a random audio file from the list<br />
    audio_file = random.choice(audio_files)<br />
    play_audio_event.set() # Set the event<br />
    # Create a thread to play the audio<br />
    audio_thread = threading.Thread(target=play_audio, args=(thisPath + "/sounds/" + input_dirname + "/" + audio_file,))<br />
    audio_thread.start() # Start the thread<br />
<br />
# Define a function to handle audio playback in a separate thread<br />
def play_audio_thread(input_file):<br />
    if play_audio_event.is_set(): # If the event is already set, return<br />
        return<br />
    play_audio_event.set() # Set the event<br />
    # Create a thread to play the audio<br />
    audio_thread = threading.Thread(target=play_audio, args=(input_file,))<br />
    audio_thread.start() # Start the thread<br />
<br />
# Define a function to play a specified audio file<br />
def play_file(audio_file):<br />
    audio_file = thisPath + "/sounds/" + audio_file # Build the absolute path under sounds/<br />
    play_audio_thread(audio_file)<br />
<br />
# Define a function to set the audio volume<br />
def set_audio_volume(input_volume):<br />
    input_volume = float(input_volume) # Convert the input volume to a float<br />
    if input_volume > 1: # Clamp the volume to a maximum of 1<br />
        input_volume = 1<br />
    elif input_volume < 0: # Clamp the volume to a minimum of 0<br />
        input_volume = 0<br />
    pygame.mixer.music.set_volume(input_volume) # Set the volume<br />
<br />
# Define a function to set the minimum time between plays<br />
def set_min_time_between(input_time):<br />
    global min_time_bewteen_play # Use the global variable<br />
    min_time_bewteen_play = input_time # Set the minimum time between plays<br />
<br />
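# Usage sketch (illustrative; the file name below is an assumption):<br />
# play_random_audio("robot_started", False) # Play a random clip from sounds/robot_started<br />
# play_file("connected/hello.mp3") # Play a specific file under sounds/<br />
# set_audio_volume(0.5) # Set the playback volume to 50%<br />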
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/09_Automatic_Command_Execution_upon_Booting09 Automatic Command Execution upon Booting2024-03-21T09:40:43Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This tutorial demonstrates how the host controller automatically executes specific commands and sends instructions to the sub-controller each time the system boots. The code blocks in this chapter are for comprehension only and are not executable; they explain the automatic steps the product performs at startup. You can modify or extend these commands if needed.<br />
<br />
== cmd_on_boot() Function==<br />
The cmd_on_boot() function, located within the main program of the product, defines a list of commands to be executed at startup. These commands facilitate initial configurations and set up essential operational parameters for the device.<br />
<syntaxhighlight lang="python"><br />
<br />
def cmd_on_boot():<br />
    # List of commands to be executed at startup<br />
    cmd_list = [<br />
        'base -c {"T":142,"cmd":50}', # set feedback interval<br />
        'base -c {"T":131,"cmd":1}', # serial feedback flow on<br />
        'base -c {"T":143,"cmd":0}', # serial echo off<br />
        'base -c {"T":4,"cmd":2}', # select the module - 0: None, 1: RoArm-M2-S, 2: Gimbal<br />
        'base -c {"T":300,"mode":0,"mac":"EF:EF:EF:EF:EF:EF"}', # the base won't be controlled by ESP-NOW broadcast commands, but it can still receive broadcast messages<br />
        'send -a -b' # add the broadcast MAC address to the peer list<br />
    ]<br />
<br />
    for i in range(0, len(cmd_list)):<br />
        camera.cmd_process(cmd_list[i]) # cmd_process is defined elsewhere in the product's main program<br />
</syntaxhighlight><br />
<br />
The product's control unit can perform certain functions via command-line instructions such as the base -c command shown above, which passes the JSON instruction that follows it directly to the sub-controller through the Raspberry Pi's GPIO serial port. Below we explain the meaning of each default boot-up command.<br />
<br />
*'''base -c {"T":142,"cmd":50}'''<br />
Sets the extra interval time for the subordinate device to continuously feedback information. The unit for the cmd value is milliseconds. This feature is used to reduce the frequency of feedback information from the slave controller, aiming to alleviate the computational pressure on the control unit from processing this feedback.<br />
*'''base -c {"T":131,"cmd":1}'''<br />
Turns on the continuous information feedback feature of the sub-controller. Once enabled, the control unit does not need to fetch information from the sub-controller in a query-response manner. Although this feature is normally enabled by default on the sub-controller, we send the command again to ensure it's activated.<br />
* '''base -c {"T":143,"cmd":0}'''<br />
Turns off the serial command echo. This way, when the control unit sends instructions to the sub-controller, the latter will not feedback on the received instructions to the control unit, preventing the control unit from processing unnecessary information.<br />
* '''base -c {"T":4,"cmd":2}'''<br />
Sets the type of the external module. A cmd value of 0 indicates no external module is connected; 1 stands for a robotic arm; and 2 for the pan-tilt. If your product does not have a pan-tilt or robotic arm installed, this value should be changed to 0.<br />
* '''base -c {"T":300,"mode":0,"mac":"EF:EF:EF:EF:EF:EF"}'''<br />
Prevents the chassis from being controlled by ESP-NOW broadcasts from other devices, except for devices with the specified MAC address. You can make up a MAC address or use the MAC address of your own ESP32 remote controller.<br />
* '''send -a -b'''<br />
Adds the broadcast address (FF:FF:FF:FF:FF:FF) to peers, enabling you to subsequently send broadcast messages directly to other devices via broadcast signals.<br />
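If you want to experiment with sending such a JSON instruction to the sub-controller yourself, a minimal sketch using the pyserial package is shown below. The port name /dev/serial0, the 115200 baud rate, and the newline terminator are assumptions; check the product's main program for the values it actually uses.<br />
<syntaxhighlight lang="python"><br />
import json # Serialize the command dictionary to a JSON string<br />
import serial # pyserial, for access to the Raspberry Pi's GPIO serial port<br />
<br />
# Assumed port name and baud rate; verify against the product's main program<br />
ser = serial.Serial('/dev/serial0', baudrate=115200, timeout=1)<br />
<br />
cmd = {"T": 131, "cmd": 1} # e.g. turn on continuous chassis feedback<br />
ser.write((json.dumps(cmd) + '\n').encode('utf-8')) # Newline assumed as the command terminator<br />
ser.close()<br />
</syntaxhighlight><br />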
<br />
You can learn about other host computer command line instructions in the following WEB command line application chapters.</div>Eng52https://www.waveshare.com/wiki/08_Sub-controller_JSON_Command_Set08 Sub-controller JSON Command Set2024-03-21T08:36:09Z<p>Eng52: /* CMD_INFO_PRINT */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In the previous chapter, we introduced a simple demo where we sent motion control commands from the host to the sub-controller. The microcontroller is capable of receiving a wide variety of commands, and in this chapter, we will introduce these commands.<br />
==Composition of JSON Commands==<br />
Taking the command {"T":1,"L":0.2,"R":0.2} sent in the previous chapter as an example, the '''T''' value in this JSON data represents the command type, while the '''L''' and '''R''' values represent the target linear velocities for the left and right wheels, respectively. The unit for linear velocity is by default meters per second (m/s). In summary, this is a motion control command whose parameters are the target linear velocities of the left and right wheels.<br><br />
All subsequent JSON commands will include a '''T''' value to define the command type, but the specific command parameters will vary depending on the type of command.<br><br />
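To make the composition concrete, here is a small sketch (not taken from the product code) that builds such a command as a Python dictionary and serializes it with the standard json module:<br />
<syntaxhighlight lang="python"><br />
import json # Standard library JSON serializer<br />
<br />
# "T" selects the command type; "L"/"R" are the target wheel speeds in m/s<br />
cmd = {"T": 1, "L": 0.2, "R": 0.2}<br />
<br />
json_str = json.dumps(cmd) # -> '{"T": 1, "L": 0.2, "R": 0.2}'<br />
print(json_str)<br />
</syntaxhighlight><br />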
==JSON Command Set==<br />
You can view the definitions of these commands in the json_cmd.h file of our open-source microcontroller routine, or add new sub-controller functionalities yourself.<br />
===Motion Control Commands===<br />
These commands are fundamental to the mobile robot and are used for motion-related control functions.<br><br />
Each command below includes three parts: an example, a brief introduction, and a detailed description.<br />
====CMD_SPEED_CTRL====<br />
*{"T":1,"L":0.5,"R":0.5}<br />
*Sets the target linear velocity for both wheels (velocity closed-loop control).<br />
L and R represent the target linear velocities for the left and right wheels, respectively, in m/s. Negative values indicate reverse rotation and 0 means stop. The range of target linear velocities depends on the motors/reducers/wheel diameters used in the product, and the relevant calculation formulas can be found in the open-source microcontroller routine. It's important to note that for chassis using brushed DC motors, when the given target velocity's absolute value is very small (but not 0), the motor's poor low-speed performance may cause significant speed fluctuations during movement.<br />
====CMD_PWM_INPUT====<br />
*{"T":11,"L":164,"R":164}<br />
*Sets the PWM value for both drive wheels (velocity open-loop control).<br />
L and R represent the PWM values for the left and right wheels, respectively, with a range of -255 to 255. Negative values indicate the reverse direction and an absolute value of 255 means 100% PWM, indicating full power operation for that wheel.<br />
====CMD_ROS_CTRL====<br />
*{"T":13,"X":0.1,"Z":0.3}<br />
*ROS control (velocity closed-loop control).<br />
This command is for ROS-based host computer control of the chassis movement. X represents the linear velocity in m/s, which can be negative; Z represents the angular velocity in rad/s, which can also be negative.<br />
====CMD_SET_MOTOR_PID====<br />
*{"T":2,"P":20,"I":2000,"D":0,"L":255}<br />
*PID controller settings.<br />
This command is used to tune the PID controller. The PID parameters in the example above are the default parameters for this product. L represents WINDUP_LIMITS, which is an interface currently not used in the product.<br />
===OLED Display Control Commands===<br />
The product comes with an OLED display, which communicates with the ESP32 module of the microcontroller via I2C. The host computer can send JSON commands to change the content displayed on the screen.<br />
<br />
====CMD_OLED_CTRL====<br />
*{"T":3,"lineNum":0,"Text":"putYourTextHere"}<br />
*Controls the display of custom content on the screen.<br />
lineNum is the line number. A single JSON command can change the content of one line. Upon receiving a new command, the default OLED screen that appears at startup will disappear, replaced by your added content. For most products using a 0.91-inch OLED display, lineNum can be 0, 1, 2, or 3, totaling four lines. Text is the content you want to display on this line. If the content is too long for one line, it will automatically wrap to the next line, but this may also push off the last line of content.<br />
<br />
====CMD_OLED_DEFAULT====<br />
*{"T":-3}<br />
* Controls the display to show the default startup screen.<br />
Use this command to revert the OLED display to the default image shown at startup.<br />
<br />
===Module Type===<br />
The mobile chassis can be equipped with different types of modules (none/mechanical arm/gimbal). This command tells the microcontroller about the currently installed module type. This command is usually sent automatically by the host computer to the microcontroller at startup, and more details will be provided in later chapters.<br />
<br />
====CMD_MODULE_TYPE====<br />
*{"T":4,"cmd":0}<br />
* Sets the module type.<br />
The cmd value represents the type of module. Currently, there are three options: 0 for no module, 1 for a robotic arm, and 2 for the pan-tilt.<br />
<br />
===IMU Related Functions===<br />
The chassis is equipped with an IMU sensor. You can use the following commands to obtain data from the IMU sensor. It's important to note that after the product is powered on, the continuous feedback function for chassis information (including IMU data) is enabled by default. The IMU-related functions are only necessary when the continuous feedback function is disabled.<br />
<br />
====CMD_GET_IMU_DATA====<br />
*{"T":126}<br />
*Retrieves IMU data.<br />
This command allows retrieval of data from the IMU sensor upon sending.<br />
<br />
==== CMD_CALI_IMU_STEP====<br />
*{"T":127}<br />
* IMU calibration (reserved interface).<br />
Current product programs do not require calibration; this command is a reserved interface for future use.<br />
<br />
==== CMD_GET_IMU_OFFSET====<br />
* {"T":128}<br />
* Retrieves current IMU offsets (reserved interface).<br />
This command can provide feedback on the offset values for each axis of the current IMU.<br />
<br />
==== CMD_SET_IMU_OFFSET====<br />
* {"T":129,"x":-12,"y":0,"z":0}<br />
* Sets the IMU offsets (reserved interface).<br />
This command allows setting the offset values for each axis of the IMU. It is a reserved command and not required for current products.<br />
<br />
=== Chassis Information Feedback===<br />
<br />
==== CMD_BASE_FEEDBACK====<br />
*{"T":130}<br />
* Chassis information feedback.<br />
After the product is powered on, chassis information feedback is typically enabled by default and occurs automatically. If the continuous feedback function for chassis information is disabled, and there's a need to obtain information about the chassis at a single instance, this command can be used to acquire basic chassis data.<br />
<br />
====CMD_BASE_FEEDBACK_FLOW====<br />
*{"T":131,"cmd":1}<br />
*Continuous chassis information feedback.<br />
Setting the cmd value to 1 enables this function, which is by default activated and continuously provides chassis information. Setting the cmd value to 0 disables this function. Once disabled, the host computer can use the CMD_BASE_FEEDBACK command to obtain chassis information.<br />
<br />
==== CMD_FEEDBACK_FLOW_INTERVAL====<br />
*{"T":142,"cmd":0}<br />
*Sets the interval for continuous feedback.<br />
The cmd value is the interval time to be set, in milliseconds (ms). This command allows adjusting the frequency of chassis feedback information.<br />
<br />
==== CMD_UART_ECHO_MODE====<br />
*{"T":143,"cmd":0}<br />
*Sets the command echo mode.<br />
When the cmd value is set to 0, echo is disabled. When the cmd value is set to 1, echo is enabled, which means the sub-controller will output the commands it receives, facilitating debugging and verification processes.<br />
<br />
===WIFI Configuration===<br />
<br />
====CMD_WIFI_ON_BOOT====<br />
* {"T":401,"cmd":3}<br />
* Set WiFi Mode at Boot.<br />
cmd value 0 turns off WiFi; 1 sets to AP mode; 2 sets to STA mode; 3 sets to AP+STA mode.<br />
<br />
==== CMD_SET_AP====<br />
* {"T":402,"ssid":"UGV","password":"12345678"}<br />
* Configure SSID and Password for AP Mode (ESP32 as a Hotspot).<br />
<br />
====CMD_SET_STA====<br />
*{"T":403,"ssid":"WIFI_NAME","password":"WIFI_PASSWORD"}<br />
* Configure SSID and Password for STA Mode (ESP32 connects to a known hotspot).<br />
<br />
==== CMD_WIFI_APSTA====<br />
*{"T":404,"ap_ssid":"UGV","ap_password":"12345678","sta_ssid":"WIFI_NAME","sta_password":"WIFI_PASSWORD"}<br />
* Set Names and Passwords for AP and STA Modes (AP+STA Mode).<br />
<br />
====CMD_WIFI_INFO====<br />
*{"T":405}<br />
* Get Current WiFi Information.<br />
<br />
==== CMD_WIFI_CONFIG_CREATE_BY_STATUS====<br />
* {"T":406}<br />
* Create a New WiFi Configuration File Using Current Settings.<br />
<br />
====CMD_WIFI_CONFIG_CREATE_BY_INPUT====<br />
* {"T":407,"mode":3,"ap_ssid":"UGV","ap_password":"12345678","sta_ssid":"WIFI_NAME","sta_password":"WIFI_PASSWORD"}<br />
* Create a New WiFi Configuration File Using Input Settings.<br />
<br />
====CMD_WIFI_STOP====<br />
* {"T":408}<br />
* Disconnect WiFi Connection.<br />
<br />
=== 12V Switch and Gimbal Settings===<br />
====CMD_LED_CTRL====<br />
* {"T":132,"IO4":255,"IO5":255}<br />
* 12V Switch Output Settings.<br />
The device's sub-controller board features two 12V switch interfaces, each with two ports, totaling four ports. This command allows you to set the output voltage of these ports. When the value is set to 255, it corresponds to the voltage of a 3S battery. By default, these ports are used to control LED lights, and you can use this command to adjust the brightness of the LEDs.<br />
<br />
====CMD_GIMBAL_CTRL_SIMPLE====<br />
* {"T":133,"X":0,"Y":0,"SPD":0,"ACC":0}<br />
* Basic Gimbal Control Command.<br />
This command is used to control the orientation of the gimbal. X represents the horizontal orientation in degrees, with positive values turning right and negative values turning left, ranging from -180 to 180 degrees. Y represents the vertical orientation in degrees, with positive values tilting up and negative values tilting down, ranging from -30 to 90 degrees. SPD stands for speed, and ACC for acceleration; when set to 0, they indicate the fastest speed/acceleration.<br />
<br />
====CMD_GIMBAL_CTRL_MOVE====<br />
*{"T":134,"X":45,"Y":45,"SX":300,"SY":300}<br />
* Continuous Gimbal Control Command.<br />
This command is for continuous control over the pan-tilt's orientation. X and Y function similarly to the basic control command, specifying the desired horizontal and vertical orientations, respectively. SX and SY represent the speeds for the X and Y axes, respectively.<br />
<br />
==== CMD_GIMBAL_CTRL_STOP====<br />
*{"T":135}<br />
* Pan-tilt Stop Command.<br />
This command can be used to immediately stop the pan-tilt's movement initiated by the previous commands.<br />
==== CMD_GIMBAL_STEADY====<br />
*{"T":137,"s":0,"y":0}<br />
* Pan-tilt Stabilization Feature.<br />
Setting s to 0 turns off this feature, and setting it to 1 enables it. When enabled, the pan-tilt automatically adjusts its vertical angle using IMU data to maintain stability. The y parameter sets the target angle between the pan-tilt and the ground, allowing the camera to look up and down even when the stabilization feature is active.<br />
<br />
==== CMD_GIMBAL_USER_CTRL====<br />
* {"T":141,"X":0,"Y":0,"SPD":300}<br />
* Pan-tilt UI Control.<br />
This command is intended for pan-tilt control via a UI interface. The X value can be -1, 0, or 1, where -1 rotates left, 0 stops, and 1 rotates right. The Y value can also be -1, 0, or 1, where -1 tilts down, 0 stops, and 1 tilts up. SPD specifies the speed of the operation.<br />
<br />
=== Robotic Arm Control===<br />
==== CMD_MOVE_INIT====<br />
*{"T":100}<br />
*Moves the Robotic Arm to Its Initial Position.<br />
Normally, the robotic arm automatically moves to its initial position upon powering up. This command may cause process blocking.<br />
<br />
==== CMD_SINGLE_JOINT_CTRL====<br />
*{"T":101,"joint":0,"rad":0,"spd":0,"acc":10}<br />
* Single Joint Motion Control.<br />
*joint: Joint number. <br />
**1: for BASE_JOINT <br />
**2: for SHOULDER_JOINT<br />
**3: for ELBOW_JOINT<br />
**4: for EOAT_JOINT (wrist/claw joint)<br />
*rad: Angle to rotate to (in radians), based on the initial position of each joint. Default angles and rotation directions for each joint are provided. <br />
**The initial position default angle for BASE_JOINT is 0, with a rotation range of -3.14 to 3.14. When the angle increases, the base joint rotates to the left; when the angle decreases, the base joint rotates to the right.<br />
**The initial position default angle for SHOULDER_JOINT is 0, with a rotation range of -1.57 to 1.57. When the angle increases, the shoulder joint rotates forward; when the angle decreases, the shoulder joint rotates backward.<br />
**The initial position default angle for ELBOW_JOINT is 1.570796, with a rotation range of -1.11 to 3.14. When the angle increases, the elbow joint rotates downward; when the angle decreases, the elbow joint rotates in the opposite direction.<br />
**The initial position default angle for EOAT_JOINT is 3.141593. For the default gripper joint, the rotation range is 1.08 to 3.14, and when the angle decreases, the gripper joint opens. If changed to a wrist joint, the rotation range is 1.08 to 5.20, and when the angle increases, the wrist joint rotates downward; when the angle decreases, the wrist joint rotates upward.<br />
*spd: Rotation speed in steps per second, with one full rotation equaling 4096 steps. A higher value indicates faster rotation; a value of 0 rotates at maximum speed. <br />
*acc: Acceleration at the start and end of the rotation, smoother with lower values, ranging from 0-254 in units of 100 steps/sec². An acc value of 0 signifies maximum acceleration.<br />
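Because the rad value is given in radians, a small hypothetical helper (ours, not part of the firmware) that builds this command from an angle in degrees can be convenient:<br />
<syntaxhighlight lang="python"><br />
import json # Serialize the command dictionary<br />
import math # Degrees-to-radians conversion<br />
<br />
def single_joint_cmd(joint, degrees, spd=0, acc=10):<br />
    # Build a CMD_SINGLE_JOINT_CTRL JSON string from an angle in degrees<br />
    return json.dumps({"T": 101, "joint": joint, "rad": math.radians(degrees), "spd": spd, "acc": acc})<br />
<br />
print(single_joint_cmd(1, 90)) # BASE_JOINT to about 1.5708 rad (90 degrees, i.e. rotate left)<br />
</syntaxhighlight><br />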
<br />
====CMD_JOINTS_RAD_CTRL====<br />
*{"T":102,"base":0,"shoulder":0,"elbow":1.57,"hand":1.57,"spd":0,"acc":10}<br />
* Full Joint Rotation Control in Radians.<br />
*base: The angle of the base joint, with rotation range as described in the "CMD_SINGLE_JOINT_CTRL" command above in the "rad" key.<br />
*shoulder: The angle of the shoulder joint.<br />
*elbow: The angle of the elbow joint.<br />
*hand: The angle of the gripper/wrist joint.<br />
*spd: The speed of rotation, measured in steps per second. A servo motor completes one full rotation in 4096 steps. A higher numerical value corresponds to a faster speed. When the speed value is 0, rotation occurs at maximum speed.<br />
*acc: The acceleration at the start and end of rotation. A smaller value results in smoother acceleration and deceleration. The value can range from 0 to 254, measured in 100 steps per second squared. For example, setting it to 10 would mean acceleration and deceleration occur at 1000 steps per second squared. When the acceleration value is 0, the maximum acceleration is used.<br />
<br />
==== CMD_SINGLE_AXIS_CTRL====<br />
* {"T":103,"axis":2,"pos":0,"spd":0.25}<br />
* Single Axis Coordinate Control.<br />
*axis specifies the axis: 1-x, 2-y, 3-z, 4-t, with units in mm for all but the T axis, which is in radians. spd is the speed coefficient, with higher values indicating faster movement.<br />
<br />
==== CMD_XYZT_GOAL_CTRL====<br />
* {"T":104,"x":235,"y":0,"z":234,"t":3.14,"spd":0.25}<br />
* Robotic Arm Coordinate Motion Control (Inverse Kinematics).<br />
This function causes blocking.<br />
<br />
====CMD_XYZT_DIRECT_CTRL====<br />
*{"T":1041,"x":235,"y":0,"z":234,"t":3.14}<br />
* Robotic Arm Coordinate Motion Control (Inverse Kinematics).<br />
This function does not cause blocking.<br />
<br />
==== CMD_SERVO_RAD_FEEDBACK====<br />
*{"T":105}<br />
* Provides feedback on the robotic arm's coordinates.<br />
<br />
====CMD_EOAT_HAND_CTRL====<br />
*{"T":106,"cmd":1.57,"spd":0,"acc":0}<br />
*End-of-Arm Tool Control in Radians.<br />
*cmd: Angle to rotate to (in radians). The initial position default angle for EOAT_JOINT is 3.141593.<br />
**For the default clamp joint, the rotation range is from 1.08 to 3.14, and when the angle decreases, the clamp joint opens.<br />
**If changed to a wrist joint, the rotation range is from 1.08 to 5.20, and when the angle increases, the wrist joint rotates downward; when the angle decreases, the wrist joint rotates upward.<br />
*spd: The speed of rotation, measured in steps per second. A servo motor completes one full rotation in 4096 steps. A higher numerical value corresponds to a faster speed. When the speed value is 0, rotation occurs at maximum speed.<br />
*acc: The acceleration at the start and end of rotation. A smaller numerical value results in smoother acceleration and deceleration. The value can range from 0 to 254, measured in 100 steps per second squared. For example, setting it to 10 would mean acceleration and deceleration occur at 1000 steps per second squared. When the acceleration value is 0, the maximum acceleration is used.<br />
<br />
====CMD_EOAT_GRAB_TORQUE====<br />
*{"T":107,"tor":200}<br />
* Clamp Force Control.<br />
The tor value can go up to 1000, representing 100% force.<br />
<br />
====CMD_SET_JOINT_PID====<br />
*{"T":108,"joint":3,"p":16,"i":0}<br />
*Joint PID Settings.<br />
<br />
====CMD_RESET_PID====<br />
*{"T":109}<br />
* Resets Joint PID Settings.<br />
<br />
==== CMD_SET_NEW_X====<br />
* {"T":110,"xAxisAngle":0}<br />
* Sets a New Direction for the X Axis.<br />
<br />
====CMD_DYNAMIC_ADAPTATION====<br />
*{"T":112,"mode":0,"b":1000,"s":1000,"e":1000,"h":1000}<br />
* Dynamic External Force Adaptation Control.<br />
<br />
=== Other Settings===<br />
<br />
==== CMD_HEART_BEAT_SET====<br />
*{"T":136,"cmd":3000}<br />
* Sets the Heartbeat Function Interval.<br />
The cmd unit is milliseconds. This command sets the interval for the heartbeat function. If the sub-controller does not receive a new motion command within this time, it will automatically stop movement. This feature helps prevent continuous movement in case the host crashes.<br />
<br />
====CMD_SET_SPD_RATE====<br />
*{"T":138,"L":1,"R":1}<br />
*Sets the Speed Ratio for Left and Right.<br />
Due to potential errors in the encoders or differences in tire traction, the device might not move straight even when both wheels are set to the same speed. This command allows fine-tuning the speed of the left and right wheels to correct this. For example, if the left wheel needs to rotate slower, you can set L to 0.98. Avoid setting L or R to values greater than 1.<br />
<br />
====CMD_GET_SPD_RATE====<br />
*{"T":139}<br />
*Retrieves the Current Speed Ratio.<br />
This command fetches the current speed ratio settings.<br />
<br />
<br />
===ESP-NOW Related Settings===<br />
<br />
==== CMD_BROADCAST_FOLLOWER====<br />
*{"T":300,"mode":1}<br />
*{"T":300,"mode":0,"mac":"CC:DB:A7:5B:E4:1C"}<br />
*Sets the mode for ESP-NOW broadcast control.<br />
When mode is 1, other devices can control it via broadcast commands; when mode is 0, only devices with the specified MAC address can control it.<br />
<br />
==== CMD_GET_MAC_ADDRESS====<br />
*{"T":302}<br />
*Retrieves the current device's MAC address.<br />
<br />
==== CMD_ESP_NOW_ADD_FOLLOWER====<br />
*{"T":303,"mac":"FF:FF:FF:FF:FF:FF"}<br />
* Adds a MAC address to the controlled device (PEER).<br />
<br />
==== CMD_ESP_NOW_REMOVE_FOLLOWER====<br />
*{"T":304,"mac":"FF:FF:FF:FF:FF:FF"}<br />
* Removes a MAC address from the PEER.<br />
<br />
====CMD_ESP_NOW_GROUP_CTRL====<br />
*{"T":305,"dev":0,"b":0,"s":0,"e":1.57,"h":1.57,"cmd":0,"megs":"hello!"}<br />
*ESP-NOW group control.<br />
<br />
====CMD_ESP_NOW_SINGLE====<br />
* {"T":306,"mac":"FF:FF:FF:FF:FF:FF","dev":0,"b":0,"s":0,"e":1.57,"h":1.57,"cmd":0,"megs":"hello!"}<br />
* ESP-NOW unicast/group control.<br />
<br />
=== Task File Related Functions===<br />
<br />
This functionality belongs to the advanced features of the microcontroller and is usually not required when using the host controller.<br />
<br />
==== CMD_SCAN_FILES==== <br />
* {"T":200}<br />
*Scans the current task files.<br />
<br />
==== CMD_CREATE_FILE==== <br />
*{"T":201,"name":"file.txt","content":"inputContentHere."}<br />
* Creates a new task file.<br />
<br />
==== CMD_READ_FILE==== <br />
*{"T":202,"name":"file.txt"}<br />
*Reads a task file.<br />
<br />
==== CMD_DELETE_FILE==== <br />
*{"T":203,"name":"file.txt"}<br />
* Deletes a task file.<br />
<br />
==== CMD_APPEND_LINE==== <br />
* {"T":204,"name":"file.txt","content":"inputContentHere."}<br />
* Adds a new instruction at the end of a task file.<br />
<br />
==== CMD_INSERT_LINE==== <br />
* {"T":205,"name":"file.txt","lineNum":3,"content":"content"}<br />
* Insert a new instruction in the middle of a task file.<br />
<br />
==== CMD_REPLACE_LINE==== <br />
* {"T":206,"name":"file.txt","lineNum":3,"content":"Content"}<br />
* Replaces an instruction in a task file.<br />
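As a hedged illustration of how these file commands chain together (again assuming a `base` BaseController instance as in the Python tutorial chapters):<br />
<syntaxhighlight lang="python"><br />
base.send_command({"T":201,"name":"file.txt","content":"inputContentHere."})  # create the task file<br />
base.send_command({"T":204,"name":"file.txt","content":"anotherInstruction"})  # append an instruction<br />
base.send_command({"T":202,"name":"file.txt"})  # read it back; the content is returned over serial<br />
</syntaxhighlight><br />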
<br />
=== Servo Settings===<br />
<br />
==== CMD_SET_SERVO_ID====<br />
* {"T":501,"raw":1,"new":11}<br />
* Changes the servo ID.<br />
raw is the servo's original ID (new servos all default to 1), and new is the ID to change it to; the new ID must not exceed 254 and cannot be negative, and 255 is reserved as the broadcast ID.<br />
<br />
==== CMD_SET_MIDDLE====<br />
* {"T":502,"id":11}<br />
* Sets the current position of the servo as the middle position (only valid for ST series servos).<br />
<br />
====CMD_SET_SERVO_PID====<br />
* {"T":503,"id":14,"p":16}<br />
* Sets the P value of the servo's PID.<br />
<br />
===ESP32 Related Features===<br />
<br />
==== CMD_REBOOT====<br />
* {"T":600}<br />
* Reboot the ESP32.<br />
<br />
==== CMD_FREE_FLASH_SPACE====<br />
* {"T":601}<br />
* Retrieves the remaining space size in the FLASH memory.<br />
<br />
==== CMD_BOOT_MISSION_INFO====<br />
* {"T":602}<br />
* Outputs the current boot mission file.<br />
<br />
==== CMD_RESET_BOOT_MISSION====<br />
* {"T":603}<br />
* Resets the boot mission file to its default or a predetermined state.<br />
<br />
==== CMD_NVS_CLEAR====<br />
* {"T":604}<br />
* Clears the ESP32's Non-Volatile Storage (NVS) area. This command can be useful if there are issues with establishing a WiFi connection. It's recommended to reboot the ESP32 after executing this command.<br />
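A minimal sketch of that sequence (assuming a `base` BaseController instance as in the Python tutorial chapters):<br />
<syntaxhighlight lang="python"><br />
import time<br />
<br />
base.send_command({"T":604})  # clear the NVS area<br />
time.sleep(1)  # give the operation a moment to complete<br />
base.send_command({"T":600})  # reboot the ESP32, as recommended above<br />
</syntaxhighlight><br />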
<br />
==== CMD_INFO_PRINT====<br />
*{"T":605,"cmd":1}<br />
*Sets the mode for information feedback.<br />
*When cmd is set to 1, it enables the printing of debug information. Setting cmd to 2 enables continuous feedback of chassis information. Setting cmd to 0 turns off feedback, meaning no information will be provided.</div>Eng52https://www.waveshare.com/wiki/07_Controlling_Slave_Devices_with_JSON_Commands07 Controlling Slave Devices with JSON Commands2024-03-21T08:12:40Z<p>Eng52: /* A Simple Example of Controlling a Subordinate Device with JSON Commands */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This product is developed using a brain-like dual-controller architecture, where the main control unit communicates with the slave device via a serial port (on a Raspberry Pi, the GPIO serial port). It's important to note that this chapter serves as a precursor to a more detailed exploration of the JSON command set for the slave device, and its content is similar to the previous chapter on Python chassis movement control. If you're already familiar with that chapter, you might find this overview sufficient.<br />
==Advantages of JSON data format==<br />
JSON (JavaScript Object Notation) is a lightweight data-interchange format that has become one of the standards for data transmission on the internet. Here are some advantages of JSON:<br />
*High Readability: JSON uses a text format that is easy for humans to read and write. It organizes data in key-value pairs, making it more readable and understandable during transmission and storage.<br />
*Lightweight: JSON syntax is more concise and compact compared to other data formats like XML, making it more lightweight. This reduces the size of data transmissions and network bandwidth usage, improving transmission efficiency.<br />
*Ease of Parsing: The simple and clear data structure of JSON makes it easy to parse and serialize. Nearly all programming languages offer libraries for parsing and generating JSON, allowing developers to easily work with JSON data.<br />
*Wide Language Compatibility: JSON is supported in almost all programming languages, enabling convenient data exchange and communication across different platforms and systems.<br />
*Support for Multiple Data Types: JSON supports a variety of data types, including strings, numbers, boolean values, arrays, and objects. This flexibility allows it to represent a wide range of data structures.<br />
*Seamless Integration with Web Technologies: JSON originated from JavaScript, making its integration with web technologies very tight. It is highly compatible with the JavaScript language, making it convenient to use in web applications.<br />
== A Simple Example of Controlling a Subordinate Device with JSON Commands==<br />
In the following example, we use the is_raspberry_pi5() function to determine the current model of the Raspberry Pi because the GPIO serial port device names differ between Raspberry Pi 4B and Raspberry Pi 5. It's crucial to use the correct GPIO device name and the same baud rate as the subordinate device (default is 115200).<br><br />
Before executing the code block below, ensure the robot is elevated so that the drive wheels are off the ground. Activating the code will cause the robot to move; take precautions to prevent it from falling off the table.<br />
<br />
<syntaxhighlight lang="python"><br />
from base_ctrl import BaseController<br />
import time<br />
<br />
# Function for Detecting Raspberry Pi<br />
def is_raspberry_pi5():<br />
    with open('/proc/cpuinfo', 'r') as file:<br />
        for line in file:<br />
            if 'Model' in line:<br />
                if 'Raspberry Pi 5' in line:<br />
                    return True<br />
                else:<br />
                    return False<br />
<br />
# Determine the GPIO Serial Device Name Based on the Raspberry Pi Model<br />
if is_raspberry_pi5():<br />
    base = BaseController('/dev/ttyAMA0', 115200)<br />
else:<br />
    base = BaseController('/dev/serial0', 115200)<br />
<br />
# The wheel rotates at a speed of 0.2 meters per second and stops after 2 seconds.<br />
base.send_command({"T":1,"L":0.2,"R":0.2})<br />
time.sleep(2)<br />
base.send_command({"T":1,"L":0,"R":0})<br />
</syntaxhighlight><br />
By invoking the code block mentioned above, the Raspberry Pi will initially send the command {"T":1,"L":0.2,"R":0.2} (the structure of commands will be discussed in more detail in later chapters). This command starts the wheels turning. After a two-second interval, the Raspberry Pi will send another command {"T":1,"L":0,"R":0}, causing the wheels to stop. It's important to note that even if the command to stop the wheels isn't sent, the wheels will still cease turning if no new commands are issued. This is due to a heartbeat function built into the sub-controller. The purpose of this heartbeat function is to halt the current motion command automatically if the host controller hasn't sent any new commands to the sub-controller for an extended period. This function is designed to prevent continuous motion of the sub-controller in case the host encounters a problem that leads to a freeze or crash.<br><br />
<br />
If you want the robot to continue moving indefinitely, the host controller needs to send motion control commands cyclically, at least once every 2 to 4 seconds.</div>Eng52https://www.waveshare.com/wiki/06_Retrieving_Chassis_Feedback_Information06 Retrieving Chassis Feedback Information2024-03-21T08:03:18Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
=How to Retrieve Chassis Feedback Information=<br />
Upon startup, the lower-level systems of the product continuously send various types of feedback information to the upper-level systems. This feedback can be utilized to ascertain the current operational status of the product.<br><br />
<br />
Typically, you would continuously monitor the feedback information from the lower-level systems. However, in this example, we will retrieve a single piece of JSON feedback information from the slave systems (you can continuously receive feedback information by commenting out or deleting the break line).<br><br />
<br />
Select the code block below and run it using Ctrl + Enter. The loop will exit and display the feedback information after receiving the first complete JSON message with a "T" value of 1001. The feedback information includes the current wheel speed, IMU data, gimbal angle (if installed), arm angle (if installed), power voltage, and other data.<br />
<br />
<syntaxhighlight lang="python"><br />
from base_ctrl import BaseController<br />
import json<br />
<br />
# Function for Detecting Raspberry Pi<br />
def is_raspberry_pi5():<br />
    with open('/proc/cpuinfo', 'r') as file:<br />
        for line in file:<br />
            if 'Model' in line:<br />
                if 'Raspberry Pi 5' in line:<br />
                    return True<br />
                else:<br />
                    return False<br />
<br />
# Determine the GPIO Serial Device Name Based on the Raspberry Pi Model<br />
if is_raspberry_pi5():<br />
    base = BaseController('/dev/ttyAMA0', 115200)<br />
else:<br />
    base = BaseController('/dev/serial0', 115200)<br />
<br />
# Implement an infinite loop to continuously monitor serial port data.<br />
while True:<br />
    try:<br />
        # Read a line from the serial port, decode it as a 'utf-8' string, and attempt to parse it as JSON.<br />
        data_recv_buffer = json.loads(base.rl.readline().decode('utf-8'))<br />
        # Check if the parsed data contains the key 'T'.<br />
        if 'T' in data_recv_buffer:<br />
            # If the value of 'T' is 1001, print the received data and break out of the loop.<br />
            if data_recv_buffer['T'] == 1001:<br />
                print(data_recv_buffer)<br />
                break<br />
    # If an exception occurs while reading or parsing, ignore it and continue listening for the next line.<br />
    except Exception:<br />
        pass<br />
</syntaxhighlight><br />
<br />
==Non-Blocking Method for Receiving JSON Information via Serial Port==<br />
The following code is intended for understanding the underlying principles of reading JSON information from a serial port and should not be executed.<br />
<syntaxhighlight lang="python"><br />
class ReadLine:<br />
    # Constructor: initialize an instance of the ReadLine class.<br />
    # s: the serial port object used for communication.<br />
    def __init__(self, s):<br />
        self.buf = bytearray()  # Buffer for data read from the serial port but not yet processed.<br />
        self.s = s  # Store the passed-in serial port object for subsequent reads.<br />
<br />
    def readline(self):<br />
        i = self.buf.find(b"\n")  # Check if there is a newline character in the buffer.<br />
        if i >= 0:<br />
            r = self.buf[:i+1]  # If so, extract the data up to and including the newline.<br />
            self.buf = self.buf[i+1:]  # Remove the processed data from the buffer.<br />
            return r<br />
        while True:<br />
            i = max(1, min(512, self.s.in_waiting))  # Number of bytes to read: at least 1, at most 512.<br />
            data = self.s.read(i)  # Read data from the serial port.<br />
            i = data.find(b"\n")  # Search the new data for a newline character.<br />
            if i >= 0:<br />
                r = self.buf + data[:i+1]  # If found, merge the buffered data with the data up to the newline.<br />
                self.buf[0:] = data[i+1:]  # Keep the remainder after the newline as the new buffer contents.<br />
                return r<br />
            else:<br />
                self.buf.extend(data)  # If no newline is found, append the data to the buffer and keep reading.<br />
</syntaxhighlight><br />
<br />
*This method is designed for reading data from a serial port and returning a complete line of JSON data, delimited by a newline character (\n).<br />
*If there's already a complete line of data in the buffer, it returns that line directly.<br />
*If the buffer does not contain a complete line, the method uses the in_waiting property to check how many bytes are available in the serial port's receive buffer, reading up to 512 bytes at a time.<br />
*The data read from the serial port is then merged with the data in the buffer.<br />
*The method checks the newly read data for a newline character. If found, it extracts the complete line of data and updates the buffer.<br />
*If no newline character is found, the new data is added to the buffer, and the method continues reading until a newline character is found.<br />
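For illustration only (as noted above, this code is for understanding the principle rather than for running alongside the main program, which already occupies the serial port), a usage sketch might look like this, assuming pyserial is installed and the device name matches your Raspberry Pi model:<br />
<syntaxhighlight lang="python"><br />
import json<br />
import serial<br />
<br />
ser = serial.Serial('/dev/ttyAMA0', 115200, timeout=1)  # '/dev/serial0' on a Raspberry Pi 4B<br />
reader = ReadLine(ser)<br />
while True:<br />
    line = reader.readline()  # one newline-terminated chunk<br />
    try:<br />
        print(json.loads(line.decode('utf-8')))<br />
    except (UnicodeDecodeError, ValueError):<br />
        pass  # skip partial or non-JSON lines<br />
</syntaxhighlight><br />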
===Function Characteristics===<br />
* Non-blocking: the function reads only the bytes already available in the serial buffer rather than blocking on a fixed-length read, assembling a complete line as data arrives and returning as soon as one is ready.<br />
* Efficient: By using a small buffer and limiting the read amount to a maximum of 512 bytes at a time, the function reduces memory consumption and promptly processes data to prevent buffer overflow.<br />
* Flexible: The function can flexibly read data of any length and automatically deals with data split by newline characters. This makes it particularly suitable for reading structured data like JSON.<br />
* Reliable: The function is designed to account for various scenarios, such as insufficient buffer data or the absence of newline characters in the read data, ensuring the accuracy and stability of the read operations.<br />
* Ideal for Real-time Use: This function is especially suitable for real-time reading of JSON data from a serial port, especially in scenarios where non-blocking reads are required.</div>Eng52https://www.waveshare.com/wiki/05_Building_UI_Interfaces_in_JupyterLab05 Building UI Interfaces in JupyterLab2024-03-21T07:47:17Z<p>Eng52: Created page with "<div class="wiki-pages jet-green-color"> In JupyterLab, constructing UI interfaces is commonly achieved using the "ipywidgets" library, offering a simple yet powerful method t..."</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
In JupyterLab, constructing UI interfaces is commonly achieved using the "ipywidgets" library, offering a simple yet powerful method to create interactive user interfaces. Here are the detailed steps:<br />
==Importing Required Libraries==<br />
The '''ipywidgets''' library is pre-installed in our product. If you encounter a '''library not found''' error when executing code blocks, you can install the necessary libraries for the UI interface by running pip install ipywidgets.<br />
<br />
Select the following code block and press '''Ctrl + Enter''' to execute the code.<br />
<syntaxhighlight lang="python"><br />
import ipywidgets as widgets<br />
from IPython.display import display<br />
</syntaxhighlight><br />
==Creating UI Components==<br />
We can use various components from the '''ipywidgets''' library to build our UI interface, such as text boxes, buttons, output boxes, etc. For example:<br />
<syntaxhighlight lang="python"><br />
# Creating a text box<br />
text = widgets.Text(description='Input Name:')<br />
<br />
# Creating a button<br />
button = widgets.Button(description="Say Hi")<br />
<br />
# Creating an output box<br />
output = widgets.Output()<br />
</syntaxhighlight><br />
<br />
==Defining Event Handling Functions==<br />
We need to define a function to handle user interaction events. In this example, we'll define a function to handle the button's click event and display a greeting in the output box.<br />
<syntaxhighlight lang="python"><br />
# Define a function `greet_user` that takes one parameter, `sender`, where `sender` represents the object that triggered the event, such as a button.<br />
def greet_user(sender):<br />
    # Use the `with` statement and the `output` object to capture the output of the `print` function, so it displays in the expected output area.<br />
    # `output` is a pre-defined output object.<br />
    with output:<br />
        # Use the `print` function to output a greeting message, where the `format` method is used to insert the current value of the `text` widget into the string.<br />
        # "{}" serves as a placeholder that the `format` function will replace with the value of `text.value`.<br />
        print("Hi,{}".format(text.value))<br />
<br />
# Use the `on_click` method to link the button's click event with the `greet_user` function.<br />
# When the user clicks this button, the `greet_user` function will be called.<br />
button.on_click(greet_user)<br />
</syntaxhighlight><br />
<br />
== Displaying the UI==<br />
Finally, we place all UI components in a layout and display them using the display function.<br />
<syntaxhighlight lang="python"><br />
# Place all components in a vertical layout<br />
ui = widgets.VBox([text, button, output])<br />
<br />
# Display the UI<br />
display(ui)<br />
</syntaxhighlight><br />
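Beyond this minimal example, the same pattern extends naturally to robot control. The sketch below is illustrative only: it wires a slider to the LED control function from the chassis chapters and assumes a `base` BaseController instance has already been created there.<br />
<syntaxhighlight lang="python"><br />
import ipywidgets as widgets<br />
from IPython.display import display<br />
<br />
# A brightness slider for the pan-tilt LED (IO5), using the 0-255 PWM range.<br />
led_slider = widgets.IntSlider(value=0, min=0, max=255, description='LED PWM:')<br />
<br />
def on_slide(change):<br />
    base.lights_ctrl(0, change['new'])  # assumes `base` from the chassis tutorials<br />
<br />
led_slider.observe(on_slide, names='value')<br />
display(led_slider)<br />
</syntaxhighlight><br />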
<br />
By following these steps, we can build a simple UI interface in JupyterLab. Users can enter content in the text box, and upon clicking the button, the program will display a corresponding greeting in the output box based on the input content.</div>Eng52https://www.waveshare.com/wiki/04_OLED_Screen_Control04 OLED Screen Control2024-03-21T07:40:27Z<p>Eng52: /* Displaying Dynamic Information on OLED */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
This tutorial demonstrates how to control an OLED display connected to an ESP32 module using JSON commands. OLED displays are widely used for showing various types of information, such as text and images.<br />
<br />
== OLED Screens Basics== <br />
OLED displays communicate with the ESP32 module via the I2C (Inter-Integrated Circuit) interface. These displays are capable of showing custom text content and support multi-line display.<br />
<br />
<br />
The product comes with an OLED display that communicates with the ESP32 module through the I2C interface. Upon powering up, the display automatically shows some basic information about the device. The content displayed on the screen can be altered by sending JSON commands from the host device.<br />
<br />
===OLED Screen Control JSON Commands===<br />
*{"T":3,"lineNum":0,"Text":"putYourTextHere"}<br />
**Controls the display to show custom content.<br />
**'''lineNum''' refers to the line number; a single JSON command can change the content of one line. When the sub-controller receives a new command, the default startup screen disappears and is replaced by the content you've added. For the commonly used 0.91-inch OLED displays, the value of lineNum can be 0, 1, 2, or 3, allowing for four lines in total.<br><br />
**'''Text''' is the content you wish to display on that line. If the content exceeds the line length, it will automatically wrap to the next line, potentially overwriting the last line's content.<br />
<br />
<syntaxhighlight lang="python"><br />
from base_ctrl import BaseController<br />
<br />
# Function for Detecting Raspberry Pi<br />
def is_raspberry_pi5():<br />
    with open('/proc/cpuinfo', 'r') as file:<br />
        for line in file:<br />
            if 'Model' in line:<br />
                if 'Raspberry Pi 5' in line:<br />
                    return True<br />
                else:<br />
                    return False<br />
<br />
# Determine the GPIO Serial Device Name Based on the Raspberry Pi Model<br />
if is_raspberry_pi5():<br />
    base = BaseController('/dev/ttyAMA0', 115200)<br />
else:<br />
    base = BaseController('/dev/serial0', 115200)<br />
<br />
# Modifying the Display Content on the OLED Screen<br />
base.send_command({"T":3,"lineNum":0,"Text":"this is line0"})<br />
base.send_command({"T":3,"lineNum":1,"Text":"this is line1"})<br />
base.send_command({"T":3,"lineNum":2,"Text":"this is line2"})<br />
base.send_command({"T":3,"lineNum":3,"Text":"this is line3"})<br />
<br />
</syntaxhighlight><br />
<br />
Running the provided code block will display four lines of text on the OLED:<br />
<pre><br />
this is line0<br />
<br />
this is line1<br />
<br />
this is line2<br />
<br />
this is line3<br />
</pre><br />
<br />
==Displaying Dynamic Information on OLED ==<br />
The tutorial above outlined a method for displaying simple text on the OLED screen. We will now proceed with a slightly more complex example. Running the following code block will display the current time on the OLED screen. Note that the time displayed might not be accurate due to potential discrepancies with the Raspberry Pi's clock. This example serves to demonstrate how to update the screen content in the main program, where we employ this method to display real-time information such as the device's IP address and operational status on the OLED screen.<br />
<syntaxhighlight lang="python"><br />
# Import the datetime class from the datetime module to fetch and manipulate the current date and time.<br />
from datetime import datetime<br />
# Import the time module, primarily used for delay processing within the program.<br />
import time<br />
<br />
# Create an infinite loop using while True to allow the program to run continuously.<br />
while True:<br />
    # Use datetime.now().strftime("%H:%M:%S") to obtain the current time and format it as "hour:minute:second".<br />
    current_time = datetime.now().strftime("%H:%M:%S")<br />
    # Utilize the base.send_command method to send a command that includes the current time.<br />
    base.send_command({"T":3,"lineNum":0,"Text":current_time})<br />
    # Use time.sleep(1) to pause the program for 1 second, ensuring that the time is updated and a command is sent every second.<br />
    time.sleep(1)<br />
</syntaxhighlight><br />
Running the last code block, you'll observe the first line of the OLED screen updating to show the current time, refreshing every second. This function runs in an infinite loop, which can be terminated by clicking the stop button(■) above.</div>Eng52https://www.waveshare.com/wiki/03_Pan-tilt_And_LED_Control03 Pan-tilt And LED Control2024-03-21T07:30:02Z<p>Eng52: /* Introduction */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
='''Pan-tilt Control And LED Control '''=<br />
==Pan-tilt Control == <br />
=== Introduction===<br />
The product (PT Version only) comes with a pan-tilt that has two servos, a pan servo and a tilt servo. The pan servo controls horizontal rotation, with a range of ±180° (360° in total); the tilt servo controls vertical rotation, with a range of -45°~90° (135° in total).<br><br />
For products without a pan-tilt mechanism, users can expand this function by adding a pan-tilt mechanism to the RaspRover platform themselves. <br />
<syntaxhighlight lang="python"><br />
# Import the library for base control<br />
from base_ctrl import BaseController<br />
<br />
# Function for Detecting Raspberry Pi<br />
def is_raspberry_pi5():<br />
    with open('/proc/cpuinfo', 'r') as file:<br />
        for line in file:<br />
            if 'Model' in line:<br />
                if 'Raspberry Pi 5' in line:<br />
                    return True<br />
                else:<br />
                    return False<br />
<br />
# Determine the GPIO Serial Device Name Based on the Raspberry Pi Model<br />
if is_raspberry_pi5():<br />
    base = BaseController('/dev/ttyAMA0', 115200)<br />
else:<br />
    base = BaseController('/dev/serial0', 115200)<br />
</syntaxhighlight><br />
In the code block above, we import and instantiate the library for base control. Next, we control the movement of the pan-tilt by changing the angles of the pan and tilt servos.<br><br />
Modify the values in the following code:<br><br />
* input_x: Angle of the pan servo, within the range of ±180° (total range of 360°).<br />
* input_y: Angle of the tilt servo, within the range of -45° to 90° (total range of 135°). <br />
* input_speed: Speed of the pan-tilt movement. When the speed value is 0, the movement is at its fastest. <br />
* input_acc: Acceleration at the start and end of the pan-tilt movement. Smaller values result in smoother acceleration and deceleration. When the acceleration value is 0, the maximum acceleration is applied.<br />
Run the code below to observe the pan-tilt move 45° to the right and upward before stopping.<br />
<syntaxhighlight lang="python"><br />
input_x = 45<br />
input_y = 45<br />
input_speed = 0<br />
input_acc = 0<br />
<br />
base.gimbal_ctrl(input_x, input_y, input_speed, input_acc)<br />
</syntaxhighlight><br />
<br />
In addition to controlling the pan-tilt movement by changing the angles of the two servos, you can also directly control its continuous movement.<br><br />
Modify the values in the code below:<br><br />
* input_x: the rotation mode of the pan servo. When the value is -1, it represents continuous rotation to the left (clockwise); when the value is 1, it represents continuous rotation to the right (counterclockwise); when the value is 0, it indicates stopping the rotation.<br />
* input_y: the rotation mode of the tilt servo. When the value is -1, it represents continuous downward rotation (tilting down); when the value is 1, it represents continuous upward rotation (tilting up); when the value is 0, it indicates stopping the rotation.<br />
* input_speed: speed of the pan-tilt movement. <br />
If both input_x and input_y are set to 2, the pan-tilt will automatically return to its middle position.<br><br />
Run the code below, and the pan-tilt will move to the left until it reaches 180° and then stop.<br />
<syntaxhighlight lang="python"><br />
input_x = -1<br />
input_y = 0<br />
input_speed = 0<br />
<br />
base.gimbal_base_ctrl(input_x, input_y, input_speed)<br />
</syntaxhighlight><br />
Run the following code, the pan-tilt will move upward until it reaches 90° and then stop. <br />
<syntaxhighlight lang="python"><br />
input_x = 0<br />
input_y = 1<br />
input_speed = 0<br />
<br />
base.gimbal_base_ctrl(input_x, input_y, input_speed)<br />
</syntaxhighlight><br />
To make the pan-tilt return to its middle position, run the following code:<br />
<syntaxhighlight lang="python"><br />
input_x = 2<br />
input_y = 2<br />
input_speed = 0<br />
<br />
base.gimbal_base_ctrl(input_x, input_y, input_speed)<br />
</syntaxhighlight><br />
<br />
==LED Control ==<br />
=== Introduction ===<br />
The WAVE ROVER and UGV series products feature a driver board with two integrated 12V switches (the actual maximum voltage varies with the battery voltage). These switches are controlled by the ESP32's IO4 and IO5 pins via MOSFETs. Each switch has two ports, for a total of 4x 12V switch-controlled ports. By default, IO4 controls the chassis LED (the WAVE ROVER does not have a chassis LED), and IO5 controls the pan-tilt LED. You can switch these two channels on and off, and adjust their output voltage, by sending the corresponding commands to the sub-controller. However, due to the inherent delay of MOSFET switching, the relationship between the PWM output on the ESP32's IO pins and the actual output voltage may not be linear.<br><br />
<br />
<br />
For products without LEDs, you can connect LEDs rated to withstand 12.6V to these two 12V switches (LEDs rated for 12V are generally also acceptable; for safety and battery protection, the product's UPS will not charge the battery above 12V). You can also attach other peripherals to the remaining switch ports; for example, a 12V-tolerant water gun gearbox can be connected directly to the IO5-controlled port to implement automatic aiming and shooting.<br><br />
<br />
<br />
To run the code within the code block, you can select the desired code block and then press Ctrl+Enter to run the program. If you are using JupyterLab on a mobile device or tablet, you can press the play button above the code block to run it.<br />
<br />
<br />
In the above code block, we import and instantiate the library for controlling the chassis. Next, we control the switch of the IO4 interface. The variable IO4_PWM represents the PWM output of the ESP32's IO4 pin, with a range of 0-255. When this variable is set to 0, the switch controlled by IO4 is turned off; when set to 255, the voltage output of the switch controlled by IO4 approaches the battery voltage of the UPS (the current voltage of the three lithium batteries in series inside the UPS).<br />
<br />
Run the following code block to turn on the switch controlled by IO4 (turning on the chassis headlights). Note: the WAVE ROVER does not have chassis headlights, so running this code block will produce no visible change; run the next code block instead to turn on the headlights, which are located on the camera gimbal. If the product is not equipped with a camera gimbal, there are no headlights.<br />
<br />
<syntaxhighlight lang="python"><br />
IO4_PWM = 255<br />
IO5_PWM = 0<br />
<br />
base.lights_ctrl(IO4_PWM, IO5_PWM)<br />
</syntaxhighlight><br />
To turn on the switch controlled by interface IO5 (turn on the pan-tilt LED), run the following code block:<br><br />
Note: If the product does not come with a camera pan-tilt, there are no LED lights.<br />
<syntaxhighlight lang="python"><br />
IO4_PWM = 255<br />
IO5_PWM = 255<br />
<br />
base.lights_ctrl(IO4_PWM, IO5_PWM)<br />
</syntaxhighlight><br />
<br />
If your product comes with LEDs, they should all be lit by now. You can run the following code block to decrease the brightness of the LEDs.<br />
<syntaxhighlight lang="python"><br />
IO4_PWM = 64<br />
IO5_PWM = 64<br />
<br />
base.lights_ctrl(IO4_PWM, IO5_PWM)<br />
</syntaxhighlight><br />
Finally, run the following code block to turn off the LEDs.<br />
<syntaxhighlight lang="python"><br />
base.lights_ctrl(0, 0)<br />
</syntaxhighlight><br />
Run a code block here that integrates the pan-tilt functionality:<br />
<syntaxhighlight lang="python"><br />
import time<br />
base.gimbal_ctrl(0, 0, 0, 0)<br />
base.lights_ctrl(255, 255)<br />
time.sleep(0.3)<br />
base.gimbal_ctrl(45, 0, 0, 0)<br />
base.lights_ctrl(0, 0)<br />
time.sleep(0.3)<br />
base.gimbal_ctrl(-45, 0, 0, 0)<br />
base.lights_ctrl(255, 255)<br />
time.sleep(0.3)<br />
base.gimbal_ctrl(0, 90, 0, 0)<br />
base.lights_ctrl(0, 0)<br />
time.sleep(0.3)<br />
base.gimbal_ctrl(0, 0, 0, 0)<br />
</syntaxhighlight></div>Eng52https://www.waveshare.com/wiki/02_Python_Chassis_Motion_Control02 Python Chassis Motion Control2024-03-21T07:17:53Z<p>Eng52: /* Chassis Control Demo */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
='''Python Chassis Motion Control'''=<br />
<br />
In this chapter, we'll provide a Python demo for controlling the motion of a robot's chassis. This approach can be adapted to other programming languages for similar motion control tasks.<br />
==Control Mechanism Overview== <br />
We utilize code blocks within JupyterLab to compose JSON commands. These commands are then dispatched to the microcontroller via the GPIO serial port on a Raspberry Pi (the default baud rate for communication with the sub-controller is 115200). Upon receiving these commands, the microcontroller executes the specified actions.<br><br />
<br />
Further sections will delve into the variety of commands that can be sent to the sub-controller. Alternatively, you might choose to implement these functionalities in a different programming language or develop a dedicated application for the controlling system.<br />
<br />
== Design Advantages== <br />
Adopting a dual-controller architecture significantly liberates the resources of the master device. In this setup, the host controller (such as a Raspberry Pi or Jetson SBC) acts as the "brain", while the ESP32 sub-controller serves as a "cerebellum-like" entity handling the lower-level motion controls. This division of tasks allows the host to focus on high-level tasks like vision processing and decision-making, while the sub-controller efficiently manages precise movement control and other low-level tasks. Such an arrangement ensures that the sub-controller can maintain accurate wheel rotation speeds through high-frequency PID control, without overburdening the host with computationally intensive tasks.<br />
<br />
== Main Program (app.py)==<br />
The main program of the product, app.py, automatically executes upon booting due to the configuration set by autorun.sh (which is pre-configured in the product). Running app.py occupies the GPIO serial port and the camera, which might lead to conflicts or errors if other tutorials or programs require these resources. Be sure to disable the auto-start of app.py before proceeding with development or learning.<br><br />
<br />
As app.py employs multithreading and uses crontab for autostart, typical commands like sudo killall python won't suffice to terminate it. You would need to comment out the relevant line in crontab and reboot the device.<br />
crontab -e<br />
Upon your first use of this command, you will be prompted to select an editor to open the crontab file. It is recommended to choose "nano" by entering the corresponding number for nano, followed by pressing the Enter key to confirm.<br><br />
Use the "#" symbol to comment out the line containing "...... app.py".<br />
<br />
<syntaxhighlight lang="bash"><br />
# @reboot ~/ugv_pt_rpi/ugv-env/bin/python ~/ugv_pt_rpi/app.py >> ~/ugv.log 2>&1<br />
@reboot /bin/bash ~/ugv_pt_rpi/start_jupyter.sh >> ~/jupyter_log.log 2>&1<br />
</syntaxhighlight><br />
<br />
'''Note: '''Make sure not to comment out the line containing "start_jupyter.sh", as doing so will prevent JupyterLab from launching on startup, disabling your access to interactive tutorials.<br><br />
<br />
To exit and save the changes, after editing the content of crontab, press Ctrl + X to exit nano. Since you have made edits to the crontab file, it will ask if you want to save the changes (Save modified buffer?). Enter the letter 'Y' and then press Enter to exit and save the modifications.<br><br />
<br />
After restarting the device, the main program will no longer run automatically upon boot-up, allowing you to freely use the tutorials within JupyterLab. If you wish to re-enable the automatic execution of the main program at startup in the future, you can reopen the crontab file using the method described above. Simply remove the "#" symbol in front of the '@' symbol, then exit and save the changes. This will restore the automatic startup functionality of the main program.<br />
<br />
== Chassis Control Demo==<br />
In the following demo, we use the is_raspberry_pi5() function to determine the model of the Raspberry Pi since the device names for the GPIO serial port differ between Raspberry Pi 4B and Raspberry Pi 5. It's crucial to use the correct device name and baud rate (default 115200) that matches the sub-controller.<br><br />
Before executing the code block below, ensure the robot is elevated so that the drive wheels are off the ground. Activating the code will cause the robot to move; take precautions to prevent it from falling off the table.<br />
<syntaxhighlight lang="python"><br />
from base_ctrl import BaseController<br />
import time<br />
<br />
# Function for Detecting Raspberry Pi<br />
def is_raspberry_pi5():<br />
    with open('/proc/cpuinfo', 'r') as file:<br />
        for line in file:<br />
            if 'Model' in line:<br />
                if 'Raspberry Pi 5' in line:<br />
                    return True<br />
                else:<br />
                    return False<br />
<br />
# Determine the GPIO Serial Device Name Based on the Raspberry Pi Model<br />
if is_raspberry_pi5():<br />
    base = BaseController('/dev/ttyAMA0', 115200)<br />
else:<br />
    base = BaseController('/dev/serial0', 115200)<br />
<br />
# The wheel rotates at a speed of 0.2 meters per second and stops after 2 seconds.<br />
base.send_command({"T":1,"L":0.2,"R":0.2})<br />
time.sleep(2)<br />
base.send_command({"T":1,"L":0,"R":0})<br />
</syntaxhighlight><br />
<br />
By invoking the code block mentioned above, the Raspberry Pi will initially send the command {"T":1,"L":0.2,"R":0.2} (the structure of commands will be discussed in more detail in later chapters). This command starts the wheels turning. After a two-second interval, the Raspberry Pi will send another command {"T":1,"L":0,"R":0}, causing the wheels to stop. It's important to note that even if the command to stop the wheels isn't sent, the wheels will still cease turning if no new commands are issued. This is due to a heartbeat function built into the sub-controller. The purpose of this heartbeat function is to halt the current motion command automatically if the host controller hasn't sent any new commands to the sub-controller for an extended period. This function is designed to prevent continuous motion of the sub-controller in case the host encounters a problem that leads to a freeze or crash.<br />
<br />
If you want the robot to continue moving indefinitely, the host controller needs to send motion control commands cyclically, at least once every 2 to 4 seconds.<br />
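A minimal keep-alive sketch (reusing the `base` instance from the demo above; the resend interval just needs to stay shorter than the heartbeat timeout):<br />
<syntaxhighlight lang="python"><br />
import time<br />
<br />
try:<br />
    while True:<br />
        base.send_command({"T":1,"L":0.2,"R":0.2})  # refresh the motion command<br />
        time.sleep(2)  # within the 2 to 4 second window<br />
except KeyboardInterrupt:<br />
    base.send_command({"T":1,"L":0,"R":0})  # stop on Ctrl + C<br />
</syntaxhighlight><br />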
<br />
== Chassis Steering Principle==<br />
The earlier demo allows you to make the robot move forward for two seconds before stopping. Further adjustments to the parameters can control the direction of the chassis, based on the differential steering principle. <br />
<br />
When turning, the inner wheels (on the side towards which the turn is made) travel a shorter distance and thus should rotate slower to maintain stability. The differential gear system achieves this by allowing the drive wheels to rotate at different speeds. Usually, the outer wheels (on the opposite side of the turn) rotate faster than the inner wheels. This variation in speed results in the turning motion of the vehicle, allowing it to steer in the intended direction.<br />
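As a hedged illustration (reusing the `base` instance from the demo above), commanding different target speeds for the two wheels makes the chassis follow an arc:<br />
<syntaxhighlight lang="python"><br />
import time<br />
<br />
base.send_command({"T":1,"L":0.1,"R":0.2})  # right wheel faster: the robot arcs to the left<br />
time.sleep(2)<br />
base.send_command({"T":1,"L":0.2,"R":0.1})  # left wheel faster: the robot arcs to the right<br />
time.sleep(2)<br />
base.send_command({"T":1,"L":0,"R":0})  # stop<br />
</syntaxhighlight><br />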
<br />
You can control the vehicle's steering by assigning different target linear velocities to each wheel, thus achieving maneuverability and easily adjustable turning radii.</div>Eng52https://www.waveshare.com/wiki/01_Introduction_to_JupyterLab_And_Robotics_Basics01 Introduction to JupyterLab And Robotics Basics2024-03-21T07:04:29Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
=Introduction to Robotics Basics=<br />
== Control Architecture ==<br />
Our robot employs a dual-controller architecture (akin to a brain-like structure), with the master unit potentially being a Raspberry Pi, Jetson Nano, Orin Nano, or other similar single-board computers equipped with a 40PIN interface. The sub-controller utilizes an ESP32 to control the robot's peripherals, read sensor data, and manage closed-loop speed control for the motors using PID controllers. <br><br />
The dual controllers communicate through serial connections using JSON-formatted instructions. For more specific communication content, users can refer to the documentation of the sub-controller. For a new beginner, you do not need to understand those commands, you only need to follow the tutorial documentation to call common commands or encapsulated functions.<br />
=== Leveraging the Advantages of a "Dual-controller" Architecture===<br />
1. Using a Raspberry Pi or another single-board computer as the primary processing unit to handle complex tasks, such as processing visual information, with the ESP32 as a secondary controller dedicated to managing peripherals and sensors, exemplifies a modular design that enhances system flexibility and scalability. <br><br />
<br />
2. The single-board computers are tasked with higher-level processing and decision-making, while the ESP32 handles real-time, low-level tasks such as motor control. This division of tasks allows each component to focus on its functionality, optimizing the system's overall performance.<br><br />
<br />
3. This architecture efficiently allocates processing power and I/O resources, reducing the strain on a singular system and enhancing efficiency. <br><br />
<br />
4. Communication between components is facilitated through serial connections using JSON, which not only streamlines data transfer but also improves readability, easing the debugging and expansion processes.<br><br />
<br />
5. For hobbyists and makers with limited budgets, this "dual-controller" architecture strategy offers a way to maintain high performance while minimizing the costs and complexity of the system.<br />
<br />
= Basic Tutorial on Interactive Development Using JupyterLab =<br />
== What is JupyterLab? ==<br />
* Interactive Development Environment: JupyterLab is an open-source interactive development environment that offers an easy-to-use interface for coding, running experiments, and viewing data.<br />
* Ideal for Data Science and Machine Learning: Initially designed for data science and machine learning, its flexibility and ease of use also make it an excellent choice for robotic programming and experimentation.<br />
* Based on Web Technologies: JupyterLab integrates seamlessly with web technologies, providing a robust platform for various computing tasks.<br />
<br />
== Benefits of Developing with JupyterLab==<br />
1. The User-Friendly Programming Environment of JupyterLab: <br><br />
JupyterLab's clean and intuitive user interface makes programming and experimentation more accessible to beginners. Its interactive notebooks allow for easy code writing and testing, ideal for novices to explore and learn gradually.<br><br />
<br />
2. Immediate Feedback and Visualization of Results: <br><br />
JupyterLab provides instant feedback, enabling users to see the effects of code changes immediately, which is invaluable for debugging and learning. Its convenient data visualization capabilities aid in understanding the behaviors and performance of robots.<br><br />
<br />
3. Support for Multiple Programming Languages: <br><br />
JupyterLab accommodates various programming languages, including Python, offering flexibility for users of all skill levels.<br><br />
<br />
4. Customization and Extension Capabilities: <br><br />
JupyterLab's high customizability and extensibility allow users to add new features or tools as needed.<br><br />
5. Cross-Platform Accessibility: <br><br />
As a web-based tool, JupyterLab boasts excellent cross-platform capabilities, running on different operating systems and accessible through a browser.<br />
==Basic Usage of JupyterLab ==<br />
* Learning Resources: Refer to JupyterLab's official documentation for a comprehensive learning guide: https://jupyterlab.readthedocs.io/en/latest/getting_started/overview.html<br />
* Given that our interactive tutorials are conducted using Jupyter Notebook (.ipynb) files, we will introduce some basic usage techniques here.<br />
===What is a Jupyter Notebook (.ipynb) Document===<br />
* Jupyter Notebooks (.ipynb) are documents that combine executable code with narrative text (Markdown), equations (LaTeX), images, interactive visualizations, and other rich output elements.<br />
=== Switching Document Themes===<br />
* Our default theme is the light-colored one (JupyterLab Light).<br />
* You can switch to a dark theme based on your preference by clicking on Settings - Theme - JupyterLab Dark at the top of the interface.<br />
=== COMMAND / EDIT MODE===<br />
JupyterLab operates in two modes: COMMAND mode and EDIT mode.<br><br />
* COMMAND Mode: <br />
In COMMAND mode, you can quickly perform notebook-wide operations, such as adding or deleting cells, moving cells, changing cell types, etc. In this mode, the cell border is gray. You can enter COMMAND mode by pressing the Esc key.<br />
* EDIT Mode: <br />
EDIT mode allows you to enter or modify code or text within a cell. In this mode, the cell border is blue. You can enter EDIT mode by clicking inside a selected cell or pressing the Enter key.<br />
=== Cell Operations ===<br />
In JupyterLab, you can perform the following operations:<br />
* In COMMAND mode, use the up and down arrow keys to select cells.<br />
* Add Cell Below: You can add a new cell below the current cell by clicking the "+" button on the toolbar or using the shortcut key B (in COMMAND mode).<br />
* Add Cell Above: You can add a new cell above the current cell by clicking the "+" button on the toolbar or using the shortcut key A (in COMMAND mode).<br />
* Delete Cell: Press D,D (press the D key twice in quick succession in COMMAND mode) to delete the currently selected cell.<br />
* Copy Cell: Use the shortcut key C in COMMAND mode.<br />
* Paste Cell: Use the shortcut key V in COMMAND mode.<br />
* Cut Cell: Use the shortcut key X in COMMAND mode.<br />
* Undo: Use the shortcut key Z in COMMAND mode.<br />
* Redo: Use the shortcut key Shift + Z in COMMAND mode.<br />
* Convert the Current Cell to Code Block: Use the shortcut key Y in COMMAND mode.<br />
* Convert the Current Cell to Markdown: Use the shortcut key M in COMMAND mode.<br />
* Switch Cell Type: You can set a cell as a code cell, Markdown cell, or raw cell. This can be done using the toolbar dropdown menu or the shortcut keys Y (for code cell) and M (for Markdown cell) in COMMAND mode.<br />
* Run Cell: You can execute the current cell and automatically move to the next cell by clicking the "▶︎" button on the toolbar or using the shortcut key Shift + Enter.<br />
===Saving and Exporting ===<br />
* Save Notebook: You can save your notebook by clicking the "💾" button on the toolbar or using the shortcut key S (in COMMAND mode).<br />
* Export Notebook: JupyterLab allows you to export notebooks in various formats, including HTML, PDF, Markdown, etc. This can be done through the File > Export Notebook As... menu option.<br />
<br />
===What is a JupyterLab Kernel?===<br />
* A JupyterLab Kernel acts as a computational engine that executes the code written by users in their notebooks.<br />
* Each notebook is linked to a Kernel, which can be programmed in various languages such as Python, R, or Julia. Kernels also have access to resources like memory and CPU.<br />
<br />
===Setting the Kernel for a Robotic Virtual Environment===<br />
* When opening subsequent .ipynb tutorial documents, you'll need to manually select the Kernel in the notebook to ensure the robot-related code blocks execute correctly.<br />
* To do this, click on the Kernel option next to the "⭕" at the top right corner of the notebook tab and choose Python 3 (ipykernel) from the dropdown menu.<br />
<br />
===Kernel Management===<br />
* Starting: When you open a Jupyter notebook, the associated Kernel will automatically start, indicated by a small green dot appearing in front of the corresponding note in the file list.<br />
* Restarting: If the Kernel crashes or you need to clear the current session's state, you can restart the Kernel via "Kernel" -> "Restart Kernel...".<br />
* Stopping: To stop the Kernel of a running note, go to "Kernel" -> "Shut Down Kernel" within the note interface. To stop all Kernels, use "Kernel" -> "Shut Down All Kernels".<br />
Note: If a notebook's Kernel is using the camera and isn't stopped, it will continue to occupy this resource, preventing other notebooks from using it normally. Stopping the notebook's Kernel is necessary for others to function correctly.<br />
===Running Code Blocks===<br />
After selecting the correct Kernel, you can run code blocks within the notebook. In JupyterLab, code blocks are fundamental components of a notebook. Here's how to execute them:<br />
* Run a Single Code Block: Select the code block you wish to run and click the "▶︎" button on the toolbar or use the Shift + Enter shortcut. This action will execute the current code block and select the next one.<br />
<syntaxhighlight lang="python"><br />
print("test text in jupyterlab")<br />
</syntaxhighlight><br />
* Run All Code Blocks: You can also execute all code blocks in the entire notebook by clicking the Run menu on the toolbar and selecting Run All Cells.<br />
<syntaxhighlight lang="python"><br />
for i in range(0, 10):<br />
    print(i)<br />
</syntaxhighlight><br />
* Stop a Code Block: To stop a code block that's running, click the "■" button on the toolbar.<br />
<br />
With these basic operations, you can use JupyterLab effectively for a variety of tasks. More advanced features and detailed guides can be found in JupyterLab's official documentation.<br />
=== Deleting Code Block Outputs:===<br />
* To delete the output of a single code block, select the block and then click Edit - Clear Cell Output from the menu above.<br />
* To delete the outputs of all code blocks, click Edit - Clear Outputs of All Cells from the menu above.<br />
<br />
=== For More Advanced Content ===<br />
* You can refer to JupyterLab's official documentation for further learning: https://jupyterlab.readthedocs.io/en/latest/getting_started/overview.html</div>Eng52https://www.waveshare.com/wiki/ESP32-C6-ZeroESP32-C6-Zero2024-03-18T08:13:25Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|name=ESP32-C6-Zero<br />
|name2=with pin header<br />
|img=[[File:ESP32-C6-Zero00.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/esp32-c6-zero.htm}}]]<br />
|img2=[[File:ESP32-C6-Zero-2.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/esp32-c6-zero.htm?sku=26976}}]]<br />
|caption2=ESP32-C6<br/>USB Type-C<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
The ESP32-C6-Zero is a low-cost, high-performance microcontroller development board with a compact size and rich peripheral interfaces.<br/><br />
It adopts the ESP32-C6-MINI-1 as the main chip, with a RISC-V 32-bit single-core processor running at up to 160 MHz, and built-in 320KB ROM, 512KB HP SRAM, 16KB LP SRAM, and 4MB flash. With onboard standard Raspberry Pi Pico pin headers, it is compatible with a wide range of peripherals and easy to use in different application scenarios.<br><br />
On the software side you can choose either ESP-IDF or Arduino, so you can get started quickly and apply the board to your product.<br />
==Features==<br />
*Adopts the ESP32-C6-MINI-1 as the main chip, with a RISC-V 32-bit single-core processor supporting up to 160 MHz.<br />
*Integrated 320KB ROM, 512KB HP SRAM, 16KB LP SRAM, and 4MB Flash memory. <br />
*Integrated 2.4GHz WiFi 6 and BLE (Bluetooth LE) dual-mode wireless communication, with superior RF performance.<br />
*Type-C connector, easier to use.<br />
*Rich peripheral interfaces, including standard Raspberry Pi Pico interfaces, better compatibility and expandability.<br />
*Castellated module allows soldering directly to carrier boards.<br />
*Supports multiple low-power modes, making it easy to adjust communication distance, data rate, and power consumption to suit application scenarios with different power requirements.<br />
*<font color = "red">Please use the provided "WS_TCA9554PWR" file to set GPIO22 (SDA) and GPIO23 (SCL) as I2C functions for the GPIO expander so that the device functions fully.</font><br />
*<font color = "red">Please note that GPIO22 (SDA) and GPIO23 (SCL) are used for the TCA9554PWR; connect these pins only to I2C slave devices and nothing else.</font><br />
==Function Diagram==<br />
[[File:ESP32-C6-Zero-Diagram.png]]<br />
<br />
==Onboard Interfaces==<br />
[[File:ESP32-C6-Zero Inter.png]]<br />
==Pinout==<br />
[[File:ESP32-C6-Zero Pin.png]]<br />
==Dimensions==<br />
[[File:ESP32-C6-Zero dim.jpg]]<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
=Working with ESP-IDF=<br />
<div class="mw-collapsible-content"><br />
'''The following instructions assume a Windows development system; it is recommended to use the VSCode plug-in for development.'''<br />
==Develop with VSCode==<br />
===Install VSCode===<br />
*Open the [https://code.visualstudio.com/download VSCode website] to download according to the corresponding system and system bits. <br />
[[File: ESP32-C6-DEV-KIT-N8-01.png]]<br />
*After running the installation package, you can keep the default options for the rest of the installation, but for a better experience we recommend checking options 1, 2, and 3.<br />
**With options 1 and 2 enabled, you can open VSCode directly by right-clicking a file or directory, which improves the workflow.<br />
**With option 3 enabled, you can select VSCode directly when choosing how to open a file.<br />
[[File: ESP32-C6-DEV-KIT-N8-02.png]]<br />
<br />
===Install Espressif IDF Plug-in===<br />
*''' Note: Currently the latest version of the plugin is V1.6.4, users can choose the same version as us for a consistent experience!'''<br />
*Open VSCode, use '''Shift+Ctrl+X''' to enter the plug-in manager.<br />
[[File:ESP32-C6-DEV-KIT-N8-03.png]]<br />
*In the search bar, enter '''Espressif IDF''' to select the corresponding plug-in and click "Install".<br />
[[File:ESP32-C6-DEV-KIT-N8-04.png]]<br/><br />
[[File:ESP32-C6-DEV-KIT-N8-05.png]]<br />
*Press '''F1''' to input:<br />
<pre>esp-idf: configure esp-idf extension</pre><br />
[[File:ESP32-C6-DEV-KIT-N8-06.png]]<br />
*Select express (this guide is for users who install it for the first time).<br />
[[File:ESP32-C6-DEV-KIT-N8-07.png]]<br />
*Select download sever.<br />
[[File:ESP32-C6-DEV-KIT-N8-08.png]]<br />
*Select the version of ESP-IDF you want to use; we choose the latest, V5.1.1 (note that ESP-IDF supports the ESP32-C6 only from V5.1 onward).<br />
[[File:ESP32-C6-DEV-KIT-N8-09.png]]<br />
*The next two settings are the installation paths for the ESP-IDF container directory and the ESP-IDF Tools directory, respectively.<br />
[[File:ESP32-C6-DEV-KIT-N8-10.png]]<br />
*'''Note: If you have installed ESP-IDF before, or a previous installation failed, be sure to delete the files completely or create a new path containing no Chinese characters.'''<br />
*After configuring, click "Install" to download:<br />
[[File:ESP32-C6-DEV-KIT-N8-19.png]]<br />
*Enter the download interface; the corresponding tools and environment will then be installed automatically, so just wait for it to finish.<br />
[[File:ESP32-C6-DEV-KIT-N8-11.png]]<br />
*After the installation is complete, you will enter the following interface, indicating that the installation is finished.<br />
[[File:ESP32-C6-DEV-KIT-N8-12.png]]<br />
===Official Demo Usage GUIDE===<br />
===='''Create Demo''' ([[#Demo Example]])====<br />
*Press '''F1''' to enter:<br />
esp-idf:show examples projects<br />
[[File:ESP32-C6-DEV-KIT-N8-13.png]]<br />
*Select the corresponding IDF version:<br />
[[File:ESP32-C6-DEV-KIT-N8-14.png]]<br />
*Take the Hello World demo as an example:<br />
**①Select the corresponding demo.<br />
**②Its README states which chips the demo applies to (how to use the demo and its file structure are described there; omitted here).<br />
**③Click to create the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-15.png]]<br/><br />
Select the path to place the demo, and the folder name should be aligned with the demo name.<br/><br />
[[File:ESP32-C6-DEV-KIT-N8-16.png]]<br />
<br />
====Modify COM Port====<br />
*The corresponding COM ports are shown here, click to modify them.<br />
*Please select the COM ports according to your device. '''It is recommended to prioritize the use of the COM port corresponding to USB (can be viewed through the device manager).'''<br />
*In case of a download failure, please press the reset button for more than 1 second and wait for the PC to recognize the device again before downloading once more.<br />
[[File: ESP32-C6-DEV-KIT-N8-17.png]]<br />
*Select the project or demo to use:<br />
[[File:ESP32-C6-DEV-KIT-N8-18.png]]<br />
*Then we finish the modification of the COM ports.<br />
<br />
====Modify the Driver Object====<br />
*The driver object is displayed here, and you can modify it by clicking on it.<br />
*Select the project or demo to use.<br />
[[File:ESP32-C6-DEV-KIT-N8-20.png]]<br />
*Wait for a minute after clicking.<br />
[[File:ESP32-C6-DEV-KIT-N8-21.png]]<br />
*Select the object we need to drive, which is our main chip ESP32C6.<br />
[[File:ESP32-C6-DEV-KIT-N8-22.png]]<br />
*Choose the path to openocd; it doesn't matter for our purposes here, so any option will do.<br />
[[File:ESP32-C6-DEV-KIT-N8-23.png]]<br />
====The Rest of the Status Bar====<br />
*①SDK configuration editor, supports modifying most functions of ESP-IDF.<br />
*②Full clean, clears all compiled files.<br/><br />
*③Compile.<br />
*④Current download mode, default is UART.<br />
*⑤Flash the current firmware, please do it after compiling.<br />
*⑥Open the serial port monitor, used to view the serial port information.<br />
*⑦All-in-one button, compile, burn, open the serial monitor (most commonly used for debugging).<br />
:[[File:ESP32-C6-DEV-KIT-N8-24.png]]<br />
====Compile, Program, Serial Port Monitoring====<br />
*Click on the all-in-one button we described before to compile, program, and open the serial port monitor.<br />
:[[File:ESP32-C6-DEV-KIT-N8-25.png]]<br />
*Compilation may take a long time, especially the first time.<br />
:[[File:ESP32-C6-DEV-KIT-N8-26.png]]<br />
*''' During this process, the ESP-IDF may take up a lot of CPU resources, so it may cause the system to lag.'''<br />
*When flashing a new project for the first time, you need to select the download method; choose '''UART'''.<br />
:[[File:ESP32-C6-DEV-KIT-N8-27.png]]<br />
*This can also be changed later in the '''Download Methods''' section (click on it to bring up the options).<br />
:[[File:ESP32-C6-DEV-KIT-N8-28.png]]<br />
*Since the board has an onboard automatic download circuit, there is no need to enter download mode manually; flashing starts automatically.<br />
:[[File:ESP32-C6-DEV-KIT-N8-29.png]]<br />
*After a successful download, the serial monitor opens automatically; you can see the chip print the corresponding information and a prompt that it will restart after 10 s.<br />
:[[File:ESP32-C6-DEV-KIT-N8-30.png]]<br />
<br />
===Demo Example===<br />
====Hello World====<br />
The official example path: get-started -> hello_world.<br/><br />
The example effect: Output '''Hello World!''' on the '''TERMINAL''' window every 10s.<br/><br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example "hello_world" according to the above tutorial. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with the ESP32-C6 and can be used directly, with no need to modify it.<br />
*Modify the COM port and the driver object, then click compile and burn to run the demo.<br />
[[File: ESP32-C6-DEV-KIT-N8-30.png]]<br />
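For reference, the core of hello_world is only a few lines. The sketch below is a minimal approximation of the print-and-restart pattern you see in the serial monitor (it is not the verbatim official source; the 10-second countdown mirrors the restart prompt described above):<br />
<pre><br />
#include <stdio.h><br />
#include "freertos/FreeRTOS.h"<br />
#include "freertos/task.h"<br />
#include "esp_system.h"<br />
<br />
void app_main(void)<br />
{<br />
    printf("Hello World!\n");<br />
    // Count down, then restart the chip<br />
    for (int i = 10; i >= 0; i--) {<br />
        printf("Restarting in %d seconds...\n", i);<br />
        vTaskDelay(1000 / portTICK_PERIOD_MS);<br />
    }<br />
    esp_restart();<br />
}<br />
</pre><br />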
<br />
====RGB====<br />
Official example path: get-started -> blink.<br/><br />
Sample effect: onboard RGB beads blink at 1-second intervals.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Follow the tutorial above to create the official example blink. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and burn to run the demo.<br />
:[[File:ESP32-C6-DEV-KIT-N8-41.png]]<br />
<br />
====UART====<br />
Official example path: peripherals -> uart-> uart_async_rxtxtasks<br/><br />
Example effect: shorting GPIO4 and GPIO5 to send/receive UART data.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Hardware Connection'''=====<br />
<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:350px;" |ESP32-C6<br />
|style="background:green; color:white;text-align:center;" |ESP32-C6 (the same one)<br />
|-<br />
|style="text-align:center;" |GPIO4<br />
|style="text-align:center;" |GPIO5<br />
|}<br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example uart_async_rxtxtasks according to the tutorial above. ([[#Official_Demo_Usage_GUIDE|Create Example]]).<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and flash to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-77.png]]<br />
*Connect the hardware according to the GPIOs used.<br />
[[File:ESP32-C6-DEV-KIT-N8-78.png]]<br/><br />
*You can go to the definition file to see the actual GPIOs used (check '''GPIO_NUM_4''' -> Right click -> '''Go to Definition''').<br />
[[File:ESP32-C6-DEV-KIT-N8-79.png]]<br/><br />
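For orientation, the demo's UART setup boils down to the standard ESP-IDF driver calls below. This is only a hedged sketch (the pins follow the wiring table above; the baud rate and buffer size are assumptions), not the verbatim example source:<br />
<pre><br />
#include "driver/uart.h"<br />
<br />
#define TXD_PIN 4   // GPIO4, as in the wiring table above<br />
#define RXD_PIN 5   // GPIO5<br />
<br />
static void uart_init(void)<br />
{<br />
    const uart_config_t cfg = {<br />
        .baud_rate  = 115200,                  // assumed baud rate<br />
        .data_bits  = UART_DATA_8_BITS,<br />
        .parity     = UART_PARITY_DISABLE,<br />
        .stop_bits  = UART_STOP_BITS_1,<br />
        .flow_ctrl  = UART_HW_FLOWCTRL_DISABLE,<br />
        .source_clk = UART_SCLK_DEFAULT,<br />
    };<br />
    // Install the driver with a 2 KB RX buffer, then apply the config and pins<br />
    uart_driver_install(UART_NUM_1, 2048, 0, 0, NULL, 0);<br />
    uart_param_config(UART_NUM_1, &cfg);<br />
    uart_set_pin(UART_NUM_1, TXD_PIN, RXD_PIN, UART_PIN_NO_CHANGE, UART_PIN_NO_CHANGE);<br />
}<br />
</pre><br />
After initialization, '''uart_write_bytes()''' sends and '''uart_read_bytes()''' receives; with GPIO4 shorted to GPIO5, whatever is written comes straight back.<br />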
<br />
====Bluetooth====<br />
Official sample path: bluetooth -> bluedroid -> ble -> gatt_server.<br/><br />
Example effect: the ESP32-C6 exchanges data with a Bluetooth debugging assistant app on a phone.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Install the [https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_TO_BLEAssist.ZIP Bluetooth debugging assistant] on your phone.<br />
*Follow the tutorial above to create the official example gatt_server. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*The Bluetooth name and UUID are defined in the demo; the Bluetooth name is '''ESP_GATTS_DEMO'''.<br />
[[File:ESP32-C6-DEV-KIT-N8-50.png]]<br />
*Modify the COM port and the driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-51.png]]<br />
*Connect to the ESP_GATTS_DEMO Bluetooth device on the phone.<br />
*The effect of a successful connection is shown below:<br />
[[File:ESP32-C6-DEV-KIT-N8-52.png]]<br />
*Based on the UUID value in the demo, select one of the two servers for upstream transmission.<br />
*The ESP32-C6 receives data:<br />
[[File:ESP32-C6-DEV-KIT-N8-54.png]]<br />
<br />
====WIFI====<br />
Official example path: wifi -> getting_started -> station.<br/><br />
Sample effect: ESP32-C6 connects to WIFI.<br/><br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example station according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*Modify the contents of the demo to connect to the required WiFi.<br />
*Go to the '''Kconfig.projbuild''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-60.png]]<br />
*Change the original '''WiFi SSID''' and '''WiFi Password''' to the WiFi information you want to connect to.<br />
[[File:ESP32-C6-DEV-KIT-N8-61.png]]<br />
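For example, the relevant fragment of '''Kconfig.projbuild''' looks roughly like this (the default strings below are placeholders; replace them with your own network credentials):<br />
<pre><br />
config ESP_WIFI_SSID<br />
    string "WiFi SSID"<br />
    default "myssid"        # replace with your network name<br />
<br />
config ESP_WIFI_PASSWORD<br />
    string "WiFi Password"<br />
    default "mypassword"    # replace with your network password<br />
</pre><br />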
*Modify the COM port and the driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and upload to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-62.png]]<br />
*You can check the value of '''CONFIG_ESP_WIFI_SSID'''.<br />
*Go to the '''station_example_main.c''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-63.png]]<br />
*Right-click to Go to Definition.<br />
[[File:ESP32-C6-DEV-KIT-N8-64.png]]<br />
*The value set earlier can be seen:<br />
[[File:ESP32-C6-DEV-KIT-N8-65.png]]<br />
<br />
====Zigbee====<br />
*Official example 1 path: Zigbee -> light_sample -> HA_on_off_switch.<br />
*Official example 2 path: Zigbee -> light_sample -> HA_on_off_light.<br />
*Example effect: with two ESP32-C6 boards, the BOOT key on the board flashed with the HA_on_off_switch demo toggles the RGB LED on the board flashed with HA_on_off_light.<br />
*'''Note: Please upload the HA_on_off_switch demo to one board first, and then flash the HA_on_off_light demo to the other board.'''<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation 1'''=====<br />
*Create the official example HA_on_off_switch according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-66.png]]<br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation 2'''=====<br />
*Follow the tutorial above to create the official example HA_on_off_light. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo.<br />
*Modify the COM port and driver object, click compile and burn to run the demo ('''you need to wait for a moment for the two chips to establish a connection''').<br />
:[[File:ESP32-C6-DEV-KIT-N8-70.png]]<br />
*'''If the device remains unconnected''', residual network information may be stored on it; you can erase the device flash ([[#Erase Device Flash|Erase Tutorial]]) and re-establish the network.<br />
:[[File:ESP32-C6-DEV-KIT-N8-71.png]]<br />
<br />
====JTAG Debug====<br />
=====&nbsp;&nbsp;&nbsp;&nbsp; '''Software Operation'''=====<br />
*Create a debugging example; here we use the official hello_world example. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*Modify the '''launch.json''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-72.png]]<br />
*Input the following content:<br />
<pre><br />
{<br />
"version": "0.2.0",<br />
"configurations": [<br />
{<br />
"name": "GDB",<br />
"type": "cppdbg",<br />
"request": "launch",<br />
"MIMode": "gdb",<br />
"miDebuggerPath": "${command:espIdf.getXtensaGdb}",<br />
"program": "${workspaceFolder}/build/${command:espIdf.getProjectName}.elf",<br />
"windows": {<br />
"program": "${workspaceFolder}\\build\\${command:espIdf.getProjectName}.elf"<br />
},<br />
"cwd": "${workspaceFolder}",<br />
"environment": [{ "name": "PATH", "value": "${config:idf.customExtraPaths}" }],<br />
"setupCommands": [<br />
{ "text": "target remote :3333" },<br />
{ "text": "set remote hardware-watchpoint-limit 2"},<br />
{ "text": "mon reset halt" },<br />
{ "text": "thb app_main" },<br />
{ "text": "flushregs" }<br />
],<br />
"externalConsole": false,<br />
"logging": {<br />
"engineLogging": true<br />
}<br />
}<br />
]<br />
}<br />
</pre><br />
[[File:ESP32-C6-DEV-KIT-N8-73.png]]<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object ('''Please use the USB interface; the UART interface does not support JTAG debugging. The corresponding COM port can be checked through the Device Manager.'''), click compile, and flash to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-30.png]]<br />
*Press F1 and input:<br />
ESP-IDF:Device configuration<br />
[[File:ESP32-C6-DEV-KIT-N8-75.png]]<br />
*Select ''' OpenOcd Config Files'''.<br />
[[File:ESP32-C6-DEV-KIT-N8-76.png]]<br />
*Type '''board/esp32c6-builtin.cfg''' (if it is already the default, just press Enter):<br />
board/esp32c6-builtin.cfg<br />
[[File:ESP32-C6-DEV-KIT-N8077.png]]<br />
*Stretch the width of the window until '''[OpenOCD Server]''' is displayed at the bottom.<br />
[[File:ESP32-C6-DEV-KIT-N8078.png]]<br />
*Click '''[OpenOCD Server]''' and select '''Start OpenOCD'''.<br />
[[File:ESP32-C6-DEV-KIT-N8079.png]]<br />
*Successfully opened as follows:<br />
[[File:ESP32-C6-DEV-KIT-N8-80.png]]<br />
*Go to the debug function and click Debug:<br />
[[File:ESP32-C6-DEV-KIT-N8-81.png]]<br />
*Successfully enter the debugging interface:<br />
[[File:ESP32-C6-DEV-KIT-N8-82.png]]<br />
<br />
===Erase Device Flash===<br />
*Unpack the software resource package ([https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/Flash_download_tool_3.9.5_0.zip Flash debugging software]).<br />
*Open '''flash_download_tool_3.9.5.exe''' software, select ESP32-C6 and UART.<br />
[[File:ESP32-C6-DEV-KIT-N8-83.png]]<br />
*Select the UART port number and click '''Start''' (do not select any bin file).<br />
[[File:ESP32-C6-DEV-KIT-N8-84.png]]<br />
*After it finishes, click on "ERASE".<br />
[[File:ESP32-C6-DEV-KIT-N8-85.png]]<br />
*Wait for the erase to finish.<br />
[[File:ESP32-C6-DEV-KIT-N8-86.png]]<br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
=Working with Arduino=<br />
<div class="mw-collapsible-content"><br />
'''Please note that Arduino 3.0.0-alpha is based on ESP-IDF v5.1, which is quite different from the previous ESP-IDF V4.X. The original program may need to be adjusted after the following operations.'''<br />
==Environment Set-up==<br />
*Install [https://www.arduino.cc/en/software/ Arduino IDE].<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino01.png]]<br />
*Enter Arduino IDE after installation.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino02.png]]<br />
*Enter Preferences.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino03.png]]<br />
*Add JSON link:<br />
https://espressif.github.io/arduino-esp32/package_esp32_dev_index.json<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino04.png]]<br/>[[File: ESP32-C6-DEV-KIT-N8-Arduino05.png]]<br />
*Set the sketchbook (project) folder to '''C:\Users\Waveshare\AppData\Local\Arduino15\packages''' (Waveshare is the username).<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino06.png]]<br />
*Enter the development board manager, search for "esp32", select version 3.0.0-alpha3 under "esp32 by Espressif Systems" below, and click to install. (If installation fails, try using a mobile hotspot.)<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino07.png]]<br/>[[File: ESP32-C6-DEV-KIT-N8-Arduino08.png]]<br />
*Restart the Arduino IDE after installation; it is then ready to use.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino09.png]]<br />
===If the installation fails===<br />
*Failed to install version 3.0.0-alpha3:<br />
[[File:ESP32-C6-DEV-KIT-N8-install.png]]<br />
*Download [https://drive.google.com/file/d/19gMF3WHr4OreN26u2jYtGXKotRn8K16l/view?usp=sharing the resource file]:<br />
[[File:ESP32-C6-DEV-KIT-N8-install02.png]]<br />
*Click on the path "c:\Users\Waveshare\AppData\Local\Arduino15\packages" (where Waveshare is the user name of the computer, and you need to turn on Show Hidden Files).<br />
[[File:ESP32-C6-DEV-KIT-N8-install03.png]]<br />
*Unzip the downloaded files into the packages folder:<br />
[[File:ESP32-C6-DEV-KIT-N8-install04.png]]<br />
*Install it again:<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino07.png]]<br />
*Restart the Arduino IDE after installation and you're ready to go!<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino09.png]]<br />
<br />
==Create Example==<br />
*After changing the project folder above to '''c:\Users\Waveshare\AppData\Local\Arduino15\packages ("Waveshare" is the computer username)''', you can create demos from the examples in the project folder via the '''File''' menu.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino10.png]]<br />
*The following is the RGB flashing example (File -> Sketchbook -> esp32 -> hardware -> esp32 -> 3.0.0-alpha3 -> libraries -> ESP32 -> examples -> BlinkRGB under GPIO).<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino11.png]]<br />
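The heart of BlinkRGB is a call to the core's '''neopixelWrite()''' helper. A minimal sketch in the same spirit is shown below (assuming the board package defines '''RGB_BUILTIN''' for this board; the brightness value is an arbitrary choice for illustration):<br />
<pre><br />
#define RGB_BRIGHTNESS 64  // brightness (0-255), chosen for illustration<br />
<br />
void setup() {}<br />
<br />
void loop() {<br />
  neopixelWrite(RGB_BUILTIN, RGB_BRIGHTNESS, 0, 0); // red<br />
  delay(1000);<br />
  neopixelWrite(RGB_BUILTIN, 0, 0, 0);              // off<br />
  delay(1000);<br />
}<br />
</pre><br />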
*Select the development board and port.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino12.png]]<br />
*Search for esp32c6, select ESP32C6 Dev Module, and select the port to flash to.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino13.png]]<br />
*After selecting, click Upload; the Arduino IDE will compile and flash the demo.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino14.png]]<br />
*After uploading, you can see the effect on the development board.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino15.png]]<br />
</div><br />
=Resource=<br />
==Software==<br />
===Compile===<br />
* [https://code.visualstudio.com/download VScode] <br />
* [https://www.arduino.cc/en/software Arduino IDE]<br />
===UART===<br />
*[https://files.waveshare.com/wiki/LC29H(XX)-GPS-RTK-HAT/Sscom5.13.1.zip SSCOM5.13.1]<br />
===Flash===<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/Flash_download_tool_3.9.5_0.zip Flash]<br />
===Bluetooth===<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_TO_BLEAssist.ZIP Bluetooth debugging assistant]<br />
==Schematic==<br />
*[https://files.waveshare.com/wiki/ESP32-C6-Zero/ESP32-C6-Zero-Sch.pdf Schematic diagram]<br />
<br />
==Datasheet==<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_Technical_Reference_Manual.pdf ESP32-C6 Technical Reference Manual]<br />
*[https://files.waveshare.com/wiki/ESP32-C6-Pico/ESP32-C6_Series_Datasheet.pdf ESP32-C6 Series Datasheet]<br />
==Official Documentation==<br />
*[https://docs.espressif.com/projects/esp-idf/en/latest/esp32c6/index.html ESP-IDF Programming Guide]<br />
<br />
<br />
=FAQ=<br />
{{FAQ|The module appears to keep resetting, and its recognition status flashes in the Device Manager?<br />
|<br />
This may be caused by a blank flash or an unstable USB port. Long-press the BOOT button, press RESET at the same time, then release RESET, and finally release the BOOT button. The module then enters download mode, and flashing the firmware (program) resolves the issue.<br />
||}}<br />
{{FAQ|After flashing a demo and re-flashing it, the serial port sometimes fails to connect, or the flashing fails?<br />
|<br />
You can long-press the BOOT button, simultaneously press the RESET button, then release the RESET button, and finally release the BOOT button. This will put the module into download mode and can resolve most download issues.<br />
||}}<br />
{{FAQ|No ESP option appears when setting up the environment or building a project?<br />
|<br />
In VSCode, press '''F1''' and search for '''Espressif IDF'''; you will find the extension is marked as untrusted. Set it as trusted.<br />
||}}<br />
{{FAQ|After switching to another board of the same ESP model, flashing or running the program fails?<br />
|<br />
Please select the COM port and driver object again after switching boards, then compile and flash.<br />
||}}<br />
{{FAQ|After powering up the module, the recognized serial devices and USB ports keep resetting and restarting?<br />
|<br />
Check whether the USB port's supply voltage is below 5V. In general, at 4.9V or above both of the module's USB ports work normally; below 4.9V the supply may be insufficient and the USB port may disconnect. In that case, switch to a USB port with sufficient voltage.<br />
||}}<br />
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/RoArm-M2-S_Secondary_Development_Tool_UsageRoArm-M2-S Secondary Development Tool Usage2024-03-15T03:19:44Z<p>Eng52: /* Install Development Environment */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
<br />
=='''RoArm-M2-S Secondary Development Tool Usage'''==<br />
This tutorial introduces the installation and usage of secondary development tools, explaining how to upload modified demos to RoArm-M2-S. Users can perform secondary development on the open-source robotic arm demo. The development tool introduced here is the Arduino IDE. The following outlines the installation and usage of the environment dependencies, such as libraries, for the robotic arm in the Arduino IDE.<br />
==='''What's Arduino IDE'''===<br />
The Arduino IDE (Integrated Development Environment) is an open-source development platform with its own language and toolchain. Thanks to the extensive libraries it provides, complex components such as displays and sensors, as well as software platforms, are easy to use.<br />
==='''How To Install'''===<br />
===='''1. Download Arduino IDE'''====<br />
Go to [https://www.arduino.cc/en/software ''' Arduino website'''] to download the latest version. The official IDE supports downloads for different operating systems, so choose the one that matches your operating system. Here, we're downloading for Windows. If it's already installed, skip to the second step. The installation process is straightforward; simply keep clicking "Next" until completed.<br><br />
<br />
[[File:ArduinoIDE1.png|700px]]<br />
<br />
<font style="color:red;">'''Note: During the installation process, you may be prompted to install drivers. Simply continue clicking "Install" until the process is complete.'''</font><br />
<br />
==='''Install Development Environment'''===<br />
The main control module of the drive board on the robotic arm is ESP32, so we need to install the corresponding development board support for ESP32 in the Arduino IDE development environment. The steps are as follows:<br><br />
'''1.''' Open Arduino IDE, click on "File" → "Preferences". <br />
<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino03.png]]<br />
<br />
'''2.''' Add the following link to the Additional Board Manager URLs, then click "OK" to save the settings.<br />
<pre>https://dl.espressif.com/dl/package_esp32_index.json</pre><br />
<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino4.png]]<br><br />
<br />
<font style="color:red;">'''Note: If you need to add multiple development board URLs, you don't need to delete the URL for ESP32 development board support. Simply add the other URLs, separated by commas or placed on separate lines.'''</font><br><br />
For example, to add the ESP8266 development board URL, append it after the existing one, and it will display as follows:<br />
<pre>https://dl.espressif.com/dl/package_esp32_index.json, http://arduino.esp8266.com/stable/package_esp8266com_index.json</pre><br />
<br />
[[File:M2ESP323.png|600px]][[File:M2ESP324.png|600px]]<br />
<br />
<br />
'''3.''' Click to download [https://files.waveshare.com/wiki/RoArm-M2-S/esp32-2.0.11.zip '''ESP32 development package'''] and unzip. Input the following path on "my computer": <br />
<pre>C:\Users\username\AppData\Local\Arduino15</pre><br />
The "username" needs to be changed according to your computer's username. Create a new folder named "packages" and copy the extracted folder of the ESP32 development package into the "packages" folder.<br />
<br />
[[File:M2ESP325.png|700px]]<br />
<br />
Under "packages → esp32 → hardware → esp32" you can see that the installed ESP32 development board package is version 2.0.11, which matches the version required by the RoArm-M2-S open-source demo.<br />
<br />
<br />
[[File:M2ESP326.png|600px]]<br />
<br />
==='''Install Dependency Library'''===<br />
Download the [https://files.waveshare.com/wiki/RoArm-M2-S/Libraries.zip libraries], unzip them, and open the default Arduino installation path C:\Users\username\AppData\Local\Arduino15\libraries (adjust according to your actual path). Then copy the files shown in the figure into the libraries folder. <br />
<br />
[[File:M2ESP327.png|600px]]<br />
<br />
==='''Upload Demo'''===<br />
'''1.''' Download [https://files.waveshare.com/wiki/RoArm-M2-S/RoArm-M2-S_example.zip RoArm-M2-S_example], unzip it and double-click RoArm-M2_example.ino. Note that all files in this directory should be placed within the same folder. <br><br />
<br />
[[File:M2 demo.png|600px]]<br />
<br />
<br />
'''2.''' Click on "Tools" → "Port" and take note of the COM port already listed on your computer. Do not click on this COM port. (At this moment, the displayed COM port on my computer is COM1; the COM ports may differ on different computers.)<br />
<br />
[[File:M2 demo 1.png|600px]]<br />
<br />
<br />
'''3.''' Connect the drive board on the RoArm-M2-S robotic arm to your computer using a USB cable (ensure it's connected to the left USB port). Then, click on "Tools" → "Port" and select the newly appeared COM port (in my case, it's COM29).<br />
<br />
[[File:M2 dem 2.png|600px]]<br />
<br />
<br />
'''4.''' In the Arduino IDE, click on "Tools" → "Board" → "ESP32" → "ESP32 Dev Module".<br />
<br />
[[File:M2 demo 3.png|600px]]<br />
<br />
<br />
'''5.''' Click on "Tools" and set the other options as shown below (for the Partition Scheme, "Huge APP" is recommended, and ensure that PSRAM is set to "Enabled").<br />
<br />
[[File:M2 demo 4.png|600px]]<br />
<br />
<br />
'''6.''' Once all settings are configured, click "Upload" to upload the program to the drive board of the robotic arm.<br />
<br />
[[File: M2上传程序5.png|600px]]<br />
<br />
<br />
<font style="color:red;">'''If you encounter problems during the upload process and need to reinstall or change the Arduino IDE version, it's important to uninstall Arduino IDE completely. After uninstalling the software, you'll need to manually delete all contents within the folder "C:\Users\username\AppData\Local\Arduino15" (some hidden files may need to be displayed first). Then, proceed to download and reinstall the Arduino IDE.'''</font></div>Eng52https://www.waveshare.com/wiki/PCIe_TO_Gigabit_ETH_Board_(C)PCIe TO Gigabit ETH Board (C)2024-03-14T06:00:57Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:PCIe TO Gigabit ETH Board (C).jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/pcie-to-gigabit-eth-board-c.htm}}]]<br />
|caption=PCIe TO ETH<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
<br />
=Overview=<br />
==Introduction==<br />
'''PCIe TO Gigabit ETH Board (C) for Raspberry Pi 5, supports Raspberry Pi OS, is driver-free, side-mounting, and comes with an acrylic mounting plate.'''<br />
<br />
==Features==<br />
*PCI-E×1 Gen2 mode.<br />
*Only supports the Raspberry Pi 5.<br />
*Network working indicator:<br />
**Accessing 1000M network: the green indicator keeps blinking.<br />
**Accessing 100M network: the yellow indicator keeps blinking.<br />
*Equipped with original RTL8111H high-performance controller.<br />
==Note==<br />
*The PCIe interface is not enabled on the Raspberry Pi by default.<br />
<br />
=User Guide=<br />
==Hardware Connection==<br />
Pay attention to the wiring direction:<br><br />
[[File:PCIe TO Gigabit ETH Board (C) Hd.png]]<br />
*'''No need for additional power supply when using the PCIE connector by default.'''<br />
<br />
==User Guide==<br />
1: Enable the PCIe interface:<br />
The PCIe interface is not enabled on the Raspberry Pi 5 by default; add the following line to /boot/firmware/config.txt:<br />
dtparam=pciex1<br />
2: PCIe Gen2 is the default setting; if you want to enable PCIe Gen3, add the following line to /boot/firmware/config.txt:<br />
dtparam=pciex1_gen=3<br />
#Please note that the module only supports Gen2, so setting Gen3 or Gen2 on the Pi 5 makes no difference and the speed will not improve. <br />
3: Reboot the Pi 5 after the modification, and the device can be recognized. <br />
As shown below, the RTL8111 is recognized as our device; the other entry is the Raspberry Pi's own chip.<br />
[[file:PCIe-TO-Gigabit-ETH-Board-C-1-1.png]]<br />
4: Execute "ifconfig" to confirm that the NIC has been recognized.<br />
[[file:PCIe-TO-Gigabit-ETH-Board-C-1-2.png]]<br />
5: Use the ping command to test: <br />
ping baidu.com -I eth1<br />
#-I specifies the Ethernet port: <br />
[[file:PCIe-TO-Gigabit-ETH-Board-C-1-3.png]]<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/ESP32-C6-PicoESP32-C6-Pico2024-03-14T01:51:50Z<p>Eng52: /* Pinout */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|name=ESP32-C6-Pico<br />
|name2=with pin header<br />
|img=[[File:ESP32-C6-Pico00.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/esp32-c6-pico.htm}}]]<br />
|img2=[[File:ESP32-C6-Pico-2.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/esp32-c6-pico.htm?sku=26845}}]]<br />
|caption2=Main Chip: ESP32-C6<br/>Interface: USB Type-C<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
<br />
=Overview=<br />
==Introduction==<br />
The ESP32-C6-Pico is a low-cost, high-performance microcontroller development board with a compact size and rich peripheral interfaces.<br/><br />
It adopts the ESP32-C6-MINI-1 as the main chip, with a 32-bit RISC-V single-core processor running at up to 160 MHz, and integrates 320KB ROM, 512KB HP SRAM, 16KB LP SRAM, and 4MB flash. With onboard standard Raspberry Pi Pico pin headers, it is compatible with many peripheral devices and easy to use in different application scenarios.<br><br />
On the software side, you can choose either ESP-IDF or Arduino, so you can get started quickly and apply the board to your product.<br />
<br />
==Features==<br />
*Adopts the ESP32-C6-MINI-1 as the main chip, with a 32-bit RISC-V single-core processor running at up to 160 MHz.<br />
*Integrated 320KB ROM, 512KB HP SRAM, 16KB LP SRAM, and 4MB Flash memory. <br />
*Integrated 2.4GHz WiFi 6 and BLE (Bluetooth LE) dual-mode wireless communication, with superior RF performance.<br />
*Type-C connector, easier to use.<br />
*Rich peripheral interfaces, including standard Raspberry Pi Pico interfaces, better compatibility and expandability.<br />
*Castellated module allows soldering directly to carrier boards.<br />
*Supports multiple low-power modes; communication distance, data rate, and power consumption are easy to adjust, meeting application scenarios with various power requirements.<br />
*<font color = "red">Please use the provided "WS_TCA9554PWR" file to set GPIO22 (SDA) and GPIO23 (SCL) as I2C functions for the GPIO extension so that the device is fully functional.</font><br />
*<font color = "red">Please note that GPIO22 (SDA) and GPIO23 (SCL) are used for the TCA9554PWR; connect these pins only to I2C slave devices and to nothing else.</font><br />
==Function Block Diagram==<br />
[[File:ESP32-C6-Pico-fun.png]]<br />
==Onboard Interface==<br />
[[File:ESP32-C6-Pico-Inter.png]]<br />
<br />
==Pinout==<br />
[[File:ESP32-C6-Pico-Pinout2.jpg]]<br />
<br />
==Dimensions==<br />
[[File:ESP32-C6-Pico-Dimen.jpg]]<br />
==TCA9554PWR Function Description==<br />
* <font color="red"> Please note that the corresponding library files are required before using EXIO1 ~ EXIO7. ([[#Add EXIO Control Demo in VScode]], [[#Add EXIO Demo In Arduino IDE]])</font><br />
:{|border=1; style="width:1050px;";align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:200px;" | Items<br />
|style="background:green; color:white;text-align:center;width:200px;" | Function name<br />
|style="background:green; color:white;text-align:center;" | Parameters<br />
|style="background:green; color:white;text-align:center;" | Functionality<br />
|-<br />
|<br />
|-<br />
!style="text-align:center;"|Initialize TCA9554PWR <br />
|style="text-align:center;"|TCA9554PWR_Init<br />
|style="text-align:center;" |uint8_t PinState<br />
|style="text-align:center;" |Initialize all TCA9554PWR pins to PinState mode <br />
|-<br />
|<br />
|-<br />
!style="text-align:center;" rowspan="2"|Register<br />
|style="text-align:center;"|Read_REG<br />
|style="text-align:center;" |uint8_t REG<br />
|style="text-align:center;" |Read the value in the REG register of the TCA9554PWR <br />
|-<br />
|style="text-align:center;"|Write_REG<br />
|style="text-align:center;" |uint8_t REG,uint8_t Data<br />
|style="text-align:center;" |Write data to the REG register of the TCA9554PWR <br />
|-<br />
|<br />
|-<br />
!style="text-align:center;" rowspan="2"|Initialize EXIO mode<br />
|style="text-align:center;"|Mode_EXIO<br />
|style="text-align:center;"|uint8_t Pin,uint8_t State<br />
|style="text-align:center;" |Setting the mode of the Pin pin of the TCA9554PWR<br />
|-<br />
|style="text-align:center;"|Mode_EXIOS<br />
|style="text-align:center;" |uint8_t PinState<br />
|style="text-align:center;" |Setting the mode of all TCA9554PWR pins<br />
|-<br />
|<br />
|-<br />
!style="text-align:center;" rowspan="2"|Read EXIO level<br />
|style="text-align:center;"|Read_EXIO<br />
|style="text-align:center;" |uint8_t Pin<br />
|style="text-align:center;" |Read the input level of the Pin of the TCA9554PWR <br />
|-<br />
|style="text-align:center;"|Read_EXIOS<br />
|style="text-align:center;" |void<br />
|style="text-align:center;" |Read the input levels of all TCA9554PWR pins<br />
|-<br />
|<br />
|-<br />
!style="text-align:center;" rowspan="2"|Set EXIO output level<br />
|style="text-align:center;"|Set_EXIO<br />
|style="text-align:center;" |uint8_t Pin,uint8_t State<br />
|style="text-align:center;" |Set the output level of the pin of the TCA9554PWR <br />
|-<br />
|style="text-align:center;"|Set_EXIOS<br />
|style="text-align:center;" |int8_t PinState<br />
|style="text-align:center;" |Setting the output level of all TCA9554PWR pins <br />
|-<br />
|<br />
|-<br />
!style="text-align:center;" rowspan="2"|Toggle EXIO level<br />
|style="text-align:center;"|Set_Toggle<br />
|style="text-align:center;" |uint8_t Pin<br />
|style="text-align:center;" |Toggles the output level of the Pin of the TCA9554PWR <br />
|}<br />
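Putting the functions in the table together, a typical call sequence might look like the hedged sketch below (the EXIO pin constants such as '''TCA9554_EXIO3''' follow the naming used in the EXIO demo later on this page; '''TCA9554_EXIO1''' is assumed to exist by analogy):<br />
<pre><br />
#include <stdio.h><br />
#include "TCA9554PWR.h"<br />
<br />
void exio_demo(void)<br />
{<br />
    TCA9554PWR_Init(0x00);                     // initialize all pins to mode 0x00<br />
    Set_EXIO(TCA9554_EXIO1, 1);                // drive EXIO1 high<br />
    Set_Toggle(TCA9554_EXIO1);                 // toggle EXIO1 back low<br />
    uint8_t level = Read_EXIO(TCA9554_EXIO3);  // read EXIO3's input level<br />
    printf("EXIO3: %d\r\n", level);<br />
}<br />
</pre><br />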
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
=Working with ESP-IDF=<br />
<div class="mw-collapsible-content"><br />
'''The following assumes a Windows development system; the VSCode plug-in is recommended for development.'''<br />
==Develop with VSCode==<br />
===Install VSCode===<br />
*Open the [https://code.visualstudio.com/download VSCode website] to download according to the corresponding system and system bits. <br />
[[File: ESP32-C6-DEV-KIT-N8-01.png]]<br />
*After running the installation package, the remaining options can be left at their defaults, but for a better experience it is recommended to check options 1, 2, and 3.<br />
**After enabling options 1 and 2, you can open VSCode directly by right-clicking a file or directory.<br />
**After enabling option 3, you can select VSCode directly when choosing how to open a file.<br />
[[File: ESP32-C6-DEV-KIT-N8-02.png]]<br />
<br />
===Install Espressif IDF Plug-in===<br />
*''' Note: The latest version of the plug-in is currently V1.6.4; you can choose the same version as ours for a consistent experience.'''<br />
*Open VSCode, use '''Shift+Ctrl+X''' to enter the plug-in manager.<br />
[[File:ESP32-C6-DEV-KIT-N8-03.png]]<br />
*In the search bar, enter '''Espressif IDF''' to select the corresponding plug-in and click "Install".<br />
[[File:ESP32-C6-DEV-KIT-N8-04.png]]<br/><br />
[[File:ESP32-C6-DEV-KIT-N8-05.png]]<br />
*Press '''F1''' to input:<br />
<pre>esp-idf: configure esp-idf extension</pre><br />
[[File:ESP32-C6-DEV-KIT-N8-06.png]]<br />
*Select express (this guide is for users who install it for the first time).<br />
[[File:ESP32-C6-DEV-KIT-N8-07.png]]<br />
*Select the download server.<br />
[[File:ESP32-C6-DEV-KIT-N8-08.png]]<br />
*Select the ESP-IDF version you want to use; here we choose the latest, V5.1.1 (note that ESP-IDF supports the ESP32-C6 only from V5.1 onward).<br />
[[File:ESP32-C6-DEV-KIT-N8-09.png]]<br />
*The following two paths are the installation locations for the ESP-IDF container directory and the ESP-IDF Tools directory, respectively.<br />
[[File:ESP32-C6-DEV-KIT-N8-10.png]]<br />
*''' Note: If you have installed ESP-IDF before, or a previous installation failed, be sure to delete the old files completely or create a new path containing no Chinese characters.'''<br />
*After configuring, click "Install" to download:<br />
[[File:ESP32-C6-DEV-KIT-N8-19.png]]<br />
*Enter the download interface; it will automatically install the corresponding tools and environment, so just wait a moment.<br />
[[File:ESP32-C6-DEV-KIT-N8-11.png]]<br />
*When the installation is complete, the following interface appears, indicating that the installation is finished.<br />
[[File:ESP32-C6-DEV-KIT-N8-12.png]]<br />
===Official Demo Usage GUIDE===<br />
===='''Create Demo''' ([[#Demo Example]])====<br />
*Press '''F1''' to enter:<br />
esp-idf:show examples projects<br />
[[File:ESP32-C6-DEV-KIT-N8-13.png]]<br />
*Select the corresponding IDF version:<br />
[[File:ESP32-C6-DEV-KIT-N8-14.png]]<br />
*Take the Hello World demo as an example:<br />
**①Select the corresponding demo.<br />
**②Its README states which chips the demo supports (how to use the demo and its file structure are described below, so they are omitted here).<br />
**③Click to create the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-15.png]]<br/><br />
Select the path to place the demo; keep the folder name consistent with the demo name.<br/><br />
[[File:ESP32-C6-DEV-KIT-N8-16.png]]<br />
<br />
====Modify COM Port====<br />
*The corresponding COM ports are shown here, click to modify them.<br />
*Please select the COM ports according to your device. <br />
*In case of a download failure, please press the reset button for more than 1 second and wait for the PC to recognize the device again before downloading once more.<br />
[[File:ESP32-C6-17.png]]<br />
*Select the project or demo to use:<br />
[[File:ESP32-C6-DEV-KIT-N8-18.png]]<br />
*This completes the COM port modification.<br />
<br />
====Modify the Driver Object====<br />
*The driver object is displayed here, and you can modify it by clicking on it.<br />
*Select the project or demo to use.<br />
[[File:ESP32-C6-DEV-KIT-N8-20.png]]<br />
*Wait a moment after clicking.<br />
[[File:ESP32-C6-DEV-KIT-N8-21.png]]<br />
*Select the object we need to drive, which is the main chip, the ESP32-C6.<br />
[[File:ESP32-C6-DEV-KIT-N8-22.png]]<br />
*Choose the path to OpenOCD; it has no effect at this step, so any option will do.<br />
[[File:ESP32-C6-DEV-KIT-N8-23.png]]<br />
====The Rest of the Status Bar====<br />
*①SDK configuration editor: supports modifying most ESP-IDF options.<br />
*②Full clean: removes all compiled files.<br />
*③Compile.<br />
*④Current download method; the default is UART.<br />
*⑤Flash the current firmware; do this after compiling.<br />
*⑥Open the serial port monitor, used to view serial output.<br />
*⑦All-in-one button: compile, flash, and open the serial monitor (most commonly used for debugging).<br />
:[[File:ESP32-C6-DEV-KIT-N8-24.png]]<br />
====Compile, Program, Serial Port Monitoring====<br />
*Click on the all-in-one button we described before to compile, program, and open the serial port monitor.<br />
:[[File:ESP32-C6-DEV-KIT-N8-25.png]]<br />
*Compilation may take a long time, especially the first time.<br />
:[[File:ESP32-C6-DEV-KIT-N8-26.png]]<br />
*''' During this process, the ESP-IDF may take up a lot of CPU resources, so it may cause the system to lag.'''<br />
*When flashing a new project for the first time, you need to select the download method; choose '''UART'''.<br />
:[[File:ESP32-C6-DEV-KIT-N8-27.png]]<br />
*This can also be changed later in the '''Download Methods''' section (click on it to bring up the options).<br />
:[[File:ESP32-C6-DEV-KIT-N8-28.png]]<br />
*Since the board has an onboard automatic download circuit, there is no need to enter download mode manually; flashing starts automatically.<br />
:[[File:ESP32-C6-DEV-KIT-N8-29.png]]<br />
*After a successful download, the serial monitor opens automatically; you can see the chip print the corresponding information and a prompt that it will restart after 10 s.<br />
:[[File:ESP32-C6-DEV-KIT-N8-30.png]]<br />
<br />
===Add EXIO Control Demo in VScode===<br />
* Use the official example blink for modification demonstration.<br />
* Official path: get-started -> blink<br />
* Follow the tutorial above to create an official example: blink ([[#Official Demo Usage GUIDE|Create Demo]])<br />
* Enter the main project directory. <br />
[[File:ESP32-C6 TO VScode EXIO 1.png]]<br />
[[File:ESP32-C6 TO VScode EXIO 2.png]]<br />
* Download [https://files.waveshare.com/wiki/ESP32-C6-Pico/VScode_TCA9554.zip EXIO Control Demo]<br />
[[File:ESP32-C6 TO VScode EXIO 3.png]]<br />
* Copy the EXIO control demo into the project's '''main''' directory.<br />
[[File:ESP32-C6 TO VScode EXIO 4.png]]<br />
* Include the TCA9554PWR header:<br />
#include "TCA9554PWR.h"<br />
[[File:ESP32-C6 TO VScode EXIO 5.png]]<br />
* After initializing the TCA9554PWR, the EXIO pins can be controlled directly.<br />
* Add the following code so that EXIO1~EXIO7 output a high level in turn:<br />
uint8_t count = 0;         // run once, before the main loop<br />
TCA9554PWR_Init(0x00);     // initialize all TCA9554PWR pins to mode 0x00<br />
<br />
Set_EXIOS(0x01 << count);  // in the loop: drive one EXIO high each cycle<br />
count++;<br />
if (count == 7)<br />
    count = 0;<br />
uint8_t State = Read_EXIO(TCA9554_EXIO3); // read EXIO3's input level<br />
printf("EXIO3: %d\r\n", State);<br />
[[File:ESP32-C6 TO VScode EXIO 6.png]]<br />
*Select the COM port and the driver object, and then flash. <br />
*The effect is as shown below:<br />
[[File:ESP32-C6 TO VScode EXIO 7.png]]<br />
<br />
===Demo Example===<br />
====Hello World====<br />
The official example path: get-started -> hello_world.<br/><br />
The example effect: Output '''Hello World!''' on the '''TERMINAL''' window every 10s.<br/><br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example "hello_world" according to the above tutorial. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with the ESP32-C6 and can be used directly, with no need to modify it.<br />
*Modify the COM port and the driver object, then click compile and burn to run the demo.<br />
[[File: ESP32-C6-DEV-KIT-N8-30.png]]<br />
<br />
====GPIO====<br />
Official path: peripherals -> gpio -> generic_gpio.<br/><br />
Sample effect: LED blinks at 1-second intervals.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Hardware Connection'''=====<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:350px;" |ESP32-C6<br />
|style="background:green; color:white;text-align:center;" |LED<br />
|-<br />
|style="text-align:center;" |GPIO18 (or GPIO19)<br />
|style="text-align:center;" |LED+<br />
|-<br />
|style="text-align:center;" |GND<br />
|style="text-align:center;" |LED-<br />
|}<br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Follow the tutorial above to create the official example generic_gpio. ([[#Official Demo Usage GUIDE|Create Example]]).<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object, click compile and burn to run the demo.<br />
:[[File:ESP32-C6-DEV-KIT-N8-37.png]]<br />
*Go to the demo macro definition location to see what GPIOs are handled.<br />
:[[File:ESP32-C6-DEV-KIT-N8-38.png]]<br />
*Right-click and go to the GPIO definition location.<br />
:[[File:ESP32-C6-DEV-KIT-N8-39.png]]<br />
*The actual GPIOs are GPIO18, GPIO19.<br />
:[[File:ESP32-C6-DEV-KIT-N8-40.png]]<br />
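As a point of reference, toggling one of these pins with the ESP-IDF GPIO driver takes only a few calls. The sketch below is a minimal hedged example using GPIO18 from the table above (it illustrates the blink pattern; it is not the generic_gpio source itself):<br />
<pre><br />
#include "driver/gpio.h"<br />
#include "freertos/FreeRTOS.h"<br />
#include "freertos/task.h"<br />
<br />
void app_main(void)<br />
{<br />
    gpio_reset_pin(GPIO_NUM_18);<br />
    gpio_set_direction(GPIO_NUM_18, GPIO_MODE_OUTPUT);<br />
    while (1) {<br />
        gpio_set_level(GPIO_NUM_18, 1);        // LED on<br />
        vTaskDelay(1000 / portTICK_PERIOD_MS); // 1-second interval<br />
        gpio_set_level(GPIO_NUM_18, 0);        // LED off<br />
        vTaskDelay(1000 / portTICK_PERIOD_MS);<br />
    }<br />
}<br />
</pre><br />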
<br />
====RGB====<br />
Official example path: get-started -> blink.<br/><br />
Sample effect: onboard RGB beads blink at 1-second intervals.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Follow the tutorial above to create the official example blink. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object, click compile, and burn to run the demo.<br />
:[[File:ESP32-C6-DEV-KIT-N8-41.png]]<br />
<br />
====UART====<br />
Official example path: peripherals -> uart-> uart_async_rxtxtasks<br/><br />
Example effect: shorting GPIO4 and GPIO5 to send/receive UART data.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Hardware Connection'''=====<br />
<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:350px;" |ESP32-C6<br />
|style="background:green; color:white;text-align:center;" |ESP32-C6 (the same one)<br />
|-<br />
|style="text-align:center;" |GPIO4<br />
|style="text-align:center;" |GPIO5<br />
|}<br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example uart_async_rxtxtasks according to the tutorial above. ([[#Official_Demo_Usage_GUIDE|Create Example]]).<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and driver object, click compile, and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-77.png]]<br />
*Hardware connection according to the GPIO used.<br />
[[File:ESP32-C6-DEV-KIT-N8-78.png]]<br/><br />
*You can go to the definition file to see the actual GPIOs used (check '''GPIO_NUM_4''' -> Right click -> '''Go to Definition''').<br />
[[File:ESP32-C6-DEV-KIT-N8-79.png]]<br/><br />
<br />
====I2C====<br />
The official example path: peripherals -> lcd-> i2c_oled.<br/><br />
Example effect: turns on the [https://www.waveshare.com/0.96inch-oled-a.htm 0.96-inch OLED (A)] and displays a character. <br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Hardware Connection'''=====<br />
{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:350px;" |0.96inch OLED (A)<br />
|style="background:green; color:white;text-align:center;" |ESP32-C6<br />
|-<br />
|style="text-align:center;" |VCC<br />
|style="text-align:center;" |3V3<br />
|-<br />
|style="text-align:center;" |GND<br />
|style="text-align:center;" |GND<br />
|-<br />
|style="text-align:center;" |DIN<br />
|style="text-align:center;" |GPIO3<br />
|-<br />
|style="text-align:center;" |CLK<br />
|style="text-align:center;" |GPIO4<br />
|-<br />
|style="text-align:center;" |CS<br />
|style="text-align:center;" |GND<br />
|-<br />
|style="text-align:center;" |D/C<br />
|style="text-align:center;" |GND<br />
|-<br />
|style="text-align:center;" |RES<br />
|style="text-align:center;" |GPIO9<br />
|}<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example i2c_oled according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*Modify the demo to be compatible with [https://www.waveshare.com/0.96inch-oled-a.htm 0.96-inch OLED (A)].<br />
[[File:ESP32-C6-DEV-KIT-N8-31.png]]<br />
*To adapt to the 0.96-inch OLED (A), define the RES pin as GPIO9.<br />
[[File:ESP32-C6-DEV-KIT-N8-32.png]]<br />
*Modify the COM port and the driver object ('''it is recommended to prioritize the COM port corresponding to USB, which can be checked in the Device Manager'''), then click compile and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-35.png]]<br />
*The effect is as shown below:<br />
[[File:ESP32-C6-PICO-33.png]]<br />
*You can view the actual use of GPIO:<br />
[[File:ESP32-C6-DEV-KIT-N8-34.png]]<br />
<br />
====SPI====<br />
The path of the official example: peripherals -> spi_master-> lcd.<br/><br />
Example effect: displays a moving picture on the [https://www.waveshare.com/2.4inch-lcd-module.htm 2.4inch LCD Module].<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Hardware Connection'''=====<br />
{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:350px;" |2.4inch LCD Module<br />
|style="background:green; color:white;text-align:center;" |ESP32-C6<br />
|-<br />
|style="text-align:center;" |VCC<br />
|style="text-align:center;" |3V3<br />
|-<br />
|style="text-align:center;" |GND<br />
|style="text-align:center;" |GND<br />
|-<br />
|style="text-align:center;" |DIN<br />
|style="text-align:center;" |GPIO7<br />
|-<br />
|style="text-align:center;" |CLK<br />
|style="text-align:center;" |GPIO6<br />
|-<br />
|style="text-align:center;" |CS<br />
|style="text-align:center;" |GPIO0<br />
|-<br />
|style="text-align:center;" |D/C<br />
|style="text-align:center;" |GPIO1<br />
|-<br />
|style="text-align:center;" |RES<br />
|style="text-align:center;" |GPIO4<br />
|-<br />
|style="text-align:center;" |BL<br />
|style="text-align:center;" |GPIO5<br />
|}<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Right-click on the VScode icon and run VScode as administrator:<br />
[[File:ESP32-C6-DEV-KIT-N8-035.png]]<br />
*Create an official LCD example according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
Modify the demo to make it compatible with a [https://www.waveshare.com/2.4inch-lcd-module.htm 2.4inch LCD Module].<br/><br />
[[File:ESP32-C6-DEV-KIT-N8-36.png]]<br />
*Go to Declaration.<br />
[[File:ESP32-C6-DEV-KIT-N837.png]]<br />
*Since we are currently using the ESP32-C6, comment out the other chip definitions.<br />
[[File: ESP32-C6-DEV-KIT-N8-036.png]]<br />
*Then macro-define the ESP32-C6 target, '''CONFIG_IDF_TARGET_ESP32C6''':<br />
<pre><br />
//#define CONFIG_IDF_TARGET_ESP32 1<br />
#define CONFIG_IDF_TARGET_ESP32C6 1<br />
</pre><br />
[[File:ESP32-C6-DEV-KIT-N8-42.png]]<br />
*Modify the IOs used for D/C.<br />
**Go to line 60 of '''spi_master_example_main.c'''.<br />
[[File:ESP32-C6-43.png]]<br />
*Modify D/C to use available IOs (change GPIO10 and GPIO9 to GPIO0 and GPIO1); a summary of the resulting pin assignments follows below.<br />
[[File:ESP32-C6.png]]<br />
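After these edits, the example's pin macros should match the wiring table above. A hedged summary (the '''PIN_NUM_*''' macro names follow the example's convention, as with '''PIN_NUM_BCKL''' below; only the values are specific to this wiring):<br />
<pre><br />
#define PIN_NUM_MOSI 7   // DIN<br />
#define PIN_NUM_CLK  6   // CLK<br />
#define PIN_NUM_CS   0   // CS<br />
#define PIN_NUM_DC   1   // D/C<br />
#define PIN_NUM_RST  4   // RES<br />
#define PIN_NUM_BCKL 5   // BL<br />
</pre><br />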
*Modify the backlight:<br />
[[File: ESP32-C6-DEV-KIT-N8-45.png]]<br />
*Modify it as '''gpio_set_level(PIN_NUM_BCKL, 1);'''<br />
[[File:ESP32-C6-DEV-KIT-N8-46.png]]<br />
*Modify the COM port and the driver object, click to compile, and flash to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-47.png]]<br />
*The effect is as shown below:<br />
[[File:ESP32-C6-48.png]]<br />
<br />
====Bluetooth====<br />
Official sample path: bluetooth -> bluedroid -> ble -> gatt_server.<br/><br />
Example effect: the ESP32-C6 exchanges data with a Bluetooth debugging assistant app on a phone.<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Install the [https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_TO_BLEAssist.ZIP Bluetooth debugging assistant] on your phone.<br />
*Follow the tutorial above to create the official example gatt_server. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*The Bluetooth name and UUID are defined in the demo; the Bluetooth name is '''ESP_GATTS_DEMO'''.<br />
[[File:ESP32-C6-DEV-KIT-N8-50.png]]<br />
*Modify the COM port and the driver object, then click compile and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-51.png]]<br />
*Connect to the ESP_GATTS_DEMO Bluetooth device on the phone.<br />
*The effect of a successful connection is shown below:<br />
[[File:ESP32-C6-DEV-KIT-N8-52.png]]<br />
*Based on the UUID value in the demo, select one of the two servers for upstream transmission.<br />
*The ESP32-C6 receives data:<br />
[[File:ESP32-C6-DEV-KIT-N8-54.png]]<br />
<br />
====WIFI====<br />
Official example path: wifi -> getting_started -> station.<br/><br />
Sample effect: ESP32-C6 connects to WIFI.<br/><br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation'''=====<br />
*Create the official example station according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*Modify the contents of the demo to connect to the required WiFi.<br />
*Go to the '''Kconfig.projbuild''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-60.png]]<br />
*Change the original '''WiFi SSID''' and '''WiFi Password''' to the WiFi information you want to connect to.<br />
[[File:ESP32-C6-DEV-KIT-N8-61.png]]<br />
*Modify the COM port and the driver object, then click compile and upload to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-62.png]]<br />
*You can check the value of '''CONFIG_ESP_WIFI_SSID'''.<br />
*Go to the '''station_example_main.c''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-63.png]]<br />
*Right-click to Go to Definition.<br />
[[File:ESP32-C6-DEV-KIT-N8-64.png]]<br />
*The value set earlier can be seen:<br />
[[File:ESP32-C6-DEV-KIT-N8-65.png]]<br />
<br />
====Zigbee====<br />
*Official example 1 path: Zigbee -> light_sample -> HA_on_off_switch.<br />
*Official example 2 path: Zigbee -> light_sample -> HA_on_off_light.<br />
*Example effect: with two ESP32-C6 boards, the BOOT key on the board flashed with the HA_on_off_switch demo toggles the RGB LED on the board flashed with HA_on_off_light.<br />
*'''Note: Please upload the HA_on_off_switch demo to one board first, and then flash the HA_on_off_light demo to the other board.'''<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation 1'''=====<br />
*Create the official example HA_on_off_switch according to the tutorial above. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object, click compile, and burn to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-66.png]]<br />
<br />
=====&nbsp;&nbsp;&nbsp;&nbsp;'''Software Operation 2'''=====<br />
*Follow the tutorial above to create the official example HA_on_off_light. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo.<br />
*Modify the COM port and driver object, click compile and burn to run the demo ('''you need to wait for a moment for the two chips to establish a connection''').<br />
:[[File:ESP32-C6-DEV-KIT-N8-70.png]]<br />
*'''If the device remains unconnected''', residual network information may be stored on it; you can erase the device flash ([[#Erase Device Flash|Erase Tutorial]]) and re-establish the network.<br />
:[[File:ESP32-C6-DEV-KIT-N8-71.png]]<br />
<br />
====JTAG Debug====<br />
=====&nbsp;&nbsp;&nbsp;&nbsp; '''Software Operation'''=====<br />
*Create a debugging example; here we use the official hello_world example. ([[#Official Demo Usage GUIDE|Create Example]])<br />
*Modify the '''launch.json''' file.<br />
[[File:ESP32-C6-DEV-KIT-N8-72.png]]<br />
*Input the following content:<br />
<pre><br />
{<br />
"version": "0.2.0",<br />
"configurations": [<br />
{<br />
"name": "GDB",<br />
"type": "cppdbg",<br />
"request": "launch",<br />
"MIMode": "gdb",<br />
"miDebuggerPath": "${command:espIdf.getXtensaGdb}",<br />
"program": "${workspaceFolder}/build/${command:espIdf.getProjectName}.elf",<br />
"windows": {<br />
"program": "${workspaceFolder}\\build\\${command:espIdf.getProjectName}.elf"<br />
},<br />
"cwd": "${workspaceFolder}",<br />
"environment": [{ "name": "PATH", "value": "${config:idf.customExtraPaths}" }],<br />
"setupCommands": [<br />
{ "text": "target remote :3333" },<br />
{ "text": "set remote hardware-watchpoint-limit 2"},<br />
{ "text": "mon reset halt" },<br />
{ "text": "thb app_main" },<br />
{ "text": "flushregs" }<br />
],<br />
"externalConsole": false,<br />
"logging": {<br />
"engineLogging": true<br />
}<br />
}<br />
]<br />
}<br />
</pre><br />
[[File:ESP32-C6-DEV-KIT-N8-73.png]]<br />
*The demo is compatible with ESP32-C6 and can be used without modifying the demo content.<br />
*Modify the COM port and the driver object ('''Please use the USB interface; the UART interface does not support JTAG debugging. The corresponding COM port can be checked through the Device Manager.'''), click compile, and flash to run the demo.<br />
[[File:ESP32-C6-DEV-KIT-N8-30.png]]<br />
*Press F1 and input:<br />
ESP-IDF:Device configuration<br />
[[File:ESP32-C6-DEV-KIT-N8-75.png]]<br />
*Select ''' OpenOcd Config Files'''.<br />
[[File:ESP32-C6-DEV-KIT-N8-76.png]]<br />
*Type '''board/esp32c6-builtin.cfg''' (if it is already the default, just press Enter):<br />
board/esp32c6-builtin.cfg<br />
[[File:ESP32-C6-DEV-KIT-N8077.png]]<br />
*Stretch the width of the window until '''[OpenOCD Server]''' is displayed at the bottom.<br />
[[File:ESP32-C6-DEV-KIT-N8078.png]]<br />
*Click '''[OpenOCD Server]''' and select '''Start OpenOCD'''.<br />
[[File:ESP32-C6-DEV-KIT-N8079.png]]<br />
*Successfully opened as follows:<br />
[[File:ESP32-C6-DEV-KIT-N8-80.png]]<br />
*Go to the debug function and click Debug:<br />
[[File:ESP32-C6-DEV-KIT-N8-81.png]]<br />
*Successfully enter the debugging interface:<br />
[[File:ESP32-C6-DEV-KIT-N8-82.png]]<br />
<br />
===Erase Device Flash===<br />
*Unpack the software resource package ([https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/Flash_download_tool_3.9.5_0.zip Flash debugging software]).<br />
*Open '''flash_download_tool_3.9.5.exe''' software, select ESP32-C6 and UART.<br />
[[File:ESP32-C6-DEV-KIT-N8-83.png]]<br />
*Select the UART port number and click '''Start''' (do not select any bin file).<br />
[[File:ESP32-C6-DEV-KIT-N8-84.png]]<br />
*After it finishes, click on "ERASE".<br />
[[File:ESP32-C6-DEV-KIT-N8-85.png]]<br />
*Wait for the erase to finish.<br />
[[File:ESP32-C6-DEV-KIT-N8-86.png]]<br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
=Working with Arduino=<br />
<div class="mw-collapsible-content"><br />
'''Please note that Arduino 3.0.0-alpha is based on ESP-IDF v5.1, which is quite different from the previous ESP-IDF V4.X. The original program may need to be adjusted after the following operations.'''<br />
==Environment Set-up==<br />
*Install [https://www.arduino.cc/en/software/ Arduino IDE].<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino01.png]]<br />
*Enter Arduino IDE after installation.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino02.png]]<br />
*Enter Preferences.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino03.png]]<br />
*Add JSON link:<br />
https://espressif.github.io/arduino-esp32/package_esp32_dev_index.json<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino04.png]]<br/>[[File: ESP32-C6-DEV-KIT-N8-Arduino05.png]]<br />
*Set the project file folder to '''C:\Users\Waveshare\AppData\Local\Arduino15\packages''' (Waveshare is the username).<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino06.png]]<br />
*Enter the Boards Manager, search for "esp32", select version 3.0.0-alpha3 under "esp32 by Espressif Systems", and click Install. (If the installation fails, try using a mobile hotspot.)<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino07.png]]<br/>[[File: ESP32-C6-DEV-KIT-N8-Arduino08.png]]<br />
*Restart the Arduino IDE after installation; it is then ready to use.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino09.png]]<br />
===If the installation fails===<br />
*Failed to install version 3.0.0-alpha3:<br />
[[File:ESP32-C6-DEV-KIT-N8-install.png]]<br />
*Download [https://drive.google.com/file/d/19gMF3WHr4OreN26u2jYtGXKotRn8K16l/view?usp=sharing the resource file]:<br />
[[File:ESP32-C6-DEV-KIT-N8-install02.png]]<br />
*Open the path "c:\Users\Waveshare\AppData\Local\Arduino15\packages" (where Waveshare is the computer's user name; you need to turn on Show Hidden Files).<br />
[[File:ESP32-C6-DEV-KIT-N8-install03.png]]<br />
*Unzip the downloaded files to the packages folder:<br />
[[File:ESP32-C6-DEV-KIT-N8-install04.png]]<br />
*Install it again:<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino07.png]]<br />
*Restart the Arduino IDE after installation and you're ready to go!<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino09.png]]<br />
<br />
==Create Example==<br />
*After changing the project folder above to '''c:\Users\Waveshare\AppData\Local\Arduino15\packages ("Waveshare" is the computer username)''', you can create demos from the examples shipped in that folder.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino10.png]]<br />
*The following is the RGB flashing example (File -> Sketchbook -> esp32 -> hardware -> esp32 -> 3.0.0-alpha3 -> libraries -> ESP32 -> examples -> BlinkRGB under GPIO).<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino11.png]]<br />
*Select the development board and port.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino12.png]]<br />
*Search for esp32c6, select ESP32C6 Dev Module, and choose the port to download to.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino13.png]]<br />
*After selecting, click to upload and Arduino IDE will start to compile and flash the demo.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino14.png]]<br />
*After uploading, you can see the effect on the development board.<br />
[[File: ESP32-C6-DEV-KIT-N8-Arduino15.png]]<br />
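*For reference, below is a minimal sketch in the spirit of the BlinkRGB example (not the verbatim demo). It assumes RGB_BUILTIN is defined for the selected board; if it is not, replace the fallback pin with the GPIO that actually drives your board's WS2812 LED. neopixelWrite() is provided by the Arduino-ESP32 core (2.0.3 and later).<br />
<pre><br />
// Minimal RGB-blink sketch, in the spirit of BlinkRGB (illustrative only).<br />
#ifndef RGB_BUILTIN<br />
#define RGB_BUILTIN 8   // assumed fallback pin - verify on your board's schematic<br />
#endif<br />
<br />
void setup() {}<br />
<br />
void loop() {<br />
  neopixelWrite(RGB_BUILTIN, 64, 0, 0);   // red<br />
  delay(500);<br />
  neopixelWrite(RGB_BUILTIN, 0, 64, 0);   // green<br />
  delay(500);<br />
  neopixelWrite(RGB_BUILTIN, 0, 0, 64);   // blue<br />
  delay(500);<br />
  neopixelWrite(RGB_BUILTIN, 0, 0, 0);    // off<br />
  delay(500);<br />
}<br />
</pre><br />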
<br />
==Add EXIO Demo In Arduino IDE ==<br />
* Take the modification of the official demo BlinkRGB as an example:<br />
* Following the [[#Official Demo Usage GUIDE|Create Demo]] tutorial above, create the BlinkRGB demo.<br />
* For convenience, you can store it at other paths: <br />
[[File:ESP32-C6 TO Arduino EXIO 1.png]]<br />
[[File:ESP32-C6 TO Arduino EXIO 2.png]]<br />
* Download [https://files.waveshare.com/wiki/ESP32-C6-Pico/ArduinoIDE_TCA9554.zip EXIO Control Demo].<br />
[[File:ESP32-C6 TO Arduino EXIO 4.png]]<br />
*Enter the directory where you just saved it, and copy the EXIO control demo to the BlinkRGB project folder.<br />
[[File:ESP32-C6 TO Arduino EXIO 3.png]]<br />
[[File:ESP32-C6 TO Arduino EXIO 5.png]]<br />
* Include the TCA9554PWR files in BlinkRGB.ino:<br />
[[File:ESP32-C6 TO Arduino EXIO 6.png]]<br />
* Currently, you can use the EXIO control functions to operate EXIO1 through EXIO7.<br />
*Adding the following code to the setup() and loop() functions will sequentially set EXIO1 through EXIO7 to a high level and print the real-time level of EXIO3:<br />
 TCA9554PWR_Init(0x00);<br />
<br />
 uint8_t count = 0;<br />
 while(1)<br />
 {<br />
     Set_EXIOS(0x01 << count);   // drive the EXIOs high one by one<br />
     count++;<br />
     if(count == 7)<br />
         count = 0;<br />
     delay(1000);<br />
     uint8_t State = Read_EXIO(TCA9554_EXIO3);   // read EXIO3's input level<br />
     printf("EXIO3: %d\r\n", State);<br />
 }<br />
[[File:ESP32-C6 TO Arduino EXIO 7.png]]<br />
* Running effect as shown below:<br />
[[File:ESP32-C6 TO Arduino EXIO 8.png]]<br />
</div><br />
<br />
=Resource=<br />
==Software==<br />
===Compile===<br />
* [https://code.visualstudio.com/download VScode] <br />
* [https://www.arduino.cc/en/software Arduino IDE]<br />
===UART===<br />
*[https://files.waveshare.com/wiki/LC29H(XX)-GPS-RTK-HAT/Sscom5.13.1.zip SSCOM5.13.1]<br />
===Flash===<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/Flash_download_tool_3.9.5_0.zip Flash]<br />
===Bluetooth===<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_TO_BLEAssist.ZIP Bluetooth debugging assistant]<br />
==Schematic==<br />
*[https://files.waveshare.com/wiki/ESP32-C6-Pico/ESP32-C6-Pico-Sch.pdf Schematic diagram]<br />
==Datasheet==<br />
*[https://files.waveshare.com/wiki/ESP32-C6-Pico/ESP32-C6-MINI-1_Datasheet.pdf ESP32-C6-MINI-1 Datasheet]<br />
*[https://files.waveshare.com/wiki/ESP32-C6-DEV-KIT-N8/ESP32-C6_Technical_Reference_Manual.pdf ESP32-C6 Technical Reference Manual]<br />
*[https://files.waveshare.com/wiki/ESP32-C6-Pico/ESP32-C6_Series_Datasheet.pdf ESP32-C6 Series Datasheet]<br />
<br />
==Official Datasheet==<br />
*[https://docs.espressif.com/projects/esp-idf/en/latest/esp32c6/index.html ESP-IDF Datasheet]<br />
<br />
=FAQ=<br />
{{FAQ|After the module downloads the demo and re-downloads it, sometimes it fails to connect to the serial port, or the flashing fails?<br />
|<br />
Hold down the BOOT button, press the RESET button at the same time, then release the RESET button, and finally release the BOOT button. This puts the module into download mode and resolves most download issues.<br />
||}}<br />
{{FAQ|No ESP option below when setting up an environment or building a project?<br />
|<br />
In VSCode, press '''F1''' and search for '''Espressif IDF'''; you will find that it is flagged as an untrusted extension. Set it as trusted.<br />
||}}<br />
{{FAQ|After switching to another board of the same ESP model, there are issues with program flashing and execution?<br />
|<br />
Please select the COM port and the target device again after switching boards, then compile and flash.<br />
||}}<br />
{{FAQ|After powering up the module, the recognized serial devices and USB ports keep resetting and restarting?<br />
|<br />
Check whether the USB port's supply voltage is below 5V. In general, if it is 4.9V or more, both of the module's USB ports work normally. Below 4.9V, the power supply may be insufficient and the USB port may disconnect; in that case, switch to a USB port with sufficient voltage.<br />
||}}<br />
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/TestTest2024-03-13T03:03:30Z<p>Waveshare-admin: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:7.9inchLCD.jpg|400px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/7.9inch-dsi-lcd.htm}}]]<br />
|category=[[:Category:OLEDs / LCDs|OLEDs / LCDs]], [[:Category:LCD|LCD]], [[:Category:Raspberry Pi LCD|Raspberry Pi LCD]]<br />
|caption=400 x 1280, RPI, DSI<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop/7.9inch-DSI-LCD.htm 官网]<br />
|website_en=[ Website]<br />
|interface1=DSI<br />
}}<br />
<br />
=Overview=<br />
==Feature==<br />
*7.9inch IPS display with capacitive touch panel, hardware resolution is 400 x 1280.<br />
*Capacitive touch, supports up to 5-point touch.<br />
*Toughened glass capacitive touch panel, 6H hardness.<br />
*DSI interface, refresh rate up to 60Hz.<br />
*Working with Raspberry Pi, we provide the driver for Raspberry Pi OS.<br />
*Brightness is adjustable by software.<br />
*Supports Pi 4B/3B+/3A+ and CM3+/CM4; must be used with an [https://www.waveshare.com/dsi-cable-15cm.htm adapter cable].<br />
<br />
=Hardware Connection=<br />
==Working with Pi4B/3B+/3B/3A+==<br />
1. Use a 15PIN FPC cable to connect the DSI interface of the display screen to the DSI interface of the Raspberry Pi board.<br />
2. Install the Raspberry Pi on the display board with the back facing down, and connect the 5V power supply and I2C communication through the 4PIN.<br />
The final connection is shown below:<br/><br />
[[File:7.9inch DSI LCD2.png]]<br/><br />
==Working with Pi5/CM4/CM3+/CM3==<br />
1. Use the "DSI-Cable-12cm" cable to connect the display's DSI interface to the 22-pin DSI1 interface on the Raspberry Pi board.<br><br />
2. Use the "PH2.0 4PIN connection cable" to connect the display screen's 4-pin header to the 5V and GND pins on the Raspberry Pi board. (By default, it uses the I2C0 of the DSI interface. If using the I2C1 mode, you will need to additionally connect the SDA and SCL pins.)<br />
<br />
<br />
<br />
=Resource=<br />
==Software==<br />
*[https://files.waveshare.com/upload/d/d7/Panasonic_SDFormatter.zip Panasonic_SDFormatter]<br />
*[https://files.waveshare.com/upload/7/76/Win32DiskImager.zip Win32DiskImager]<br />
*[https://files.waveshare.com/upload/5/56/Putty.zip putty]<br />
==Pre-installed images==<br />
*[https://drive.google.com/file/d/1p7Ke8DKQ8Ru9muEfswhfgCT6HF6sedta/view?usp=sharing 7.9inch DSI LCD_220906_32_bullseye]<br />
*[https://drive.google.com/file/d/1JmA8A_h7qf5HROfE6fVvRA8H0eCe7daX/view Waveshare DSI LCD - Pi4 pre-install image]<br />
*[https://drive.google.com/file/d/1lJLzrlXGnWi1E_Yby3tWqKlB50TGtJpn/view?usp=sharing Waveshare DSI LCD - Pi3 pre-install image]<br />
<br />
==3D Drawing==<br />
*[https://files.waveshare.com/upload/b/bf/79inch-lcd-dsi-asm-20221112.zip 7.9inch DSI LCD 3D Drawing]<br />
=FAQ=<br />
{{FAQ|How to replace the Raspberry Pi boot logo image?<br />
|<br />
Just replace /usr/share/plymouth/themes/pix/splash.png with your customized image.<br />
||}}<br />
<br />
=Support=<br />
{{Servicebox1}}</div>Waveshare-adminhttps://www.waveshare.com/wiki/4.3inch_DSI4.3inch DSI2024-03-13T02:51:29Z<p>Waveshare-admin: </p>
<hr />
<div>TEST<br />
TEST</div>Waveshare-adminhttps://www.waveshare.com/wiki/DSIDSI2024-03-13T02:50:53Z<p>Waveshare-admin: Created page with "TSET"</p>
<hr />
<div>TSET</div>Waveshare-adminhttps://www.waveshare.com/wiki/PCIe_TO_USB_3.2_Gen1_HAT%2BPCIe TO USB 3.2 Gen1 HAT+2024-03-13T02:31:16Z<p>Eng52: </p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:PCIe TO USB 3.2 Gen1 HAT+.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/pcie-to-usb-3.2-gen1-hat-plus.htm}}]]<br />
|caption=PCIe TO USB<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
<br />
=Introduction=<br />
'''PCIe TO USB 3.2 Gen1 HAT+ for Raspberry Pi 5, onboard 4x USB connectors, driver-free, plug-and-play, 4x high-speed USB ports, HAT+ standard.'''<br /><br />
<br />
==Features==<br />
*PCI-E ×1 Gen2 mode.<br />
*Only supports the Raspberry Pi 5.<br />
*Equipped with the original high-performance VL805 main control chip.<br />
*Reserved airflow vent for a cooling fan, giving the Raspberry Pi 5 a better cooling effect and more stable performance.<br />
*Supports USB power control.<br />
<br />
==Note==<br />
*PCIE interface is not enabled on the Raspberry Pi by default.<br />
<br />
=User Guide=<br />
==Hardware Connection==<br />
Pay attention to the direction of the wires and connect as shown in the figure: <br><br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-details-3.png | 600px]]<br><br />
<br />
==How to Use==<br />
1: Enable PCIE interface:<br />
The PCIE interface is not enabled on the Raspberry Pi 5 by default; add the following to /boot/firmware/config.txt:<br />
dtparam=pciex1<br />
2: PCIE Gen2 is the default setting; if you want to enable PCIE Gen3, add the following to /boot/firmware/config.txt:<br />
dtparam=pciex1_gen=3<br />
#Note: the module only supports Gen2, so setting Gen3 or Gen2 on the PI5 makes no difference and the speed will not be improved. <br />
3: Reboot the PI5 after the modification, and the device will be recognized. <br />
As shown below, the VL805 entry is our device; the other PI5 entries are the RPI chip.<br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-1-1.png]]<br />
4: Execute "lsusb" to the USB device that has been recognized.<br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-1-2.png]]<br />
<br />
==USB Power Supply Control==<br />
USB port No. 1~4:<br><br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-1-4.jpg|700px]]<br />
===uhubctl Tool Control (Default) ===<br />
1: Install uhubctl tool:<br />
sudo apt-get install uhubctl<br />
2: How to use:<br />
#Turn all USB port power off<br />
sudo uhubctl -l 1-1 -a off<br />
#With no port specified, all USB power on the bus is disabled <br />
#Turn USB port power on<br />
sudo uhubctl -l 1-1 -p 1 -a on<br />
sudo uhubctl -l 1-1 -p 2 -a on<br />
sudo uhubctl -l 1-1 -p 3 -a on<br />
sudo uhubctl -l 1-1 -p 4 -a on<br />
#-p specifies the port number <br />
#-a specifies the power state <br />
#-l (lowercase L) specifies the USB bus, which can be viewed with "lsusb -t"<br />
#The port numbers correspond to the USB 2.0 port numbers; if no other USB devices are connected, the commands above apply as written<br />
#Turn off power to a single USB port <br />
sudo uhubctl -l 1-1 -p 1 -a off<br />
sudo uhubctl -l 1-1 -p 2 -a off<br />
sudo uhubctl -l 1-1 -p 3 -a off<br />
sudo uhubctl -l 1-1 -p 4 -a off<br />
#Note: the first time, turning off a single USB port may not work; turn all ports off first, after which single ports can be controlled!<br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-1-6.png]]<br />
<br />
===GPIO Control===<br />
GPIO control is not supported by default, and you need to solder the 0R as shown below:<br><br />
[[file:PCIe-TO-USB-3.2-Gen1-HAT-Plus-1-5.jpg|700px]]<br><br />
USB controls the corresponding GPIO: <br><br />
USB1 GPIO28<br><br />
USB2 GPIO27<br><br />
USB3 GPIO26<br><br />
USB4 GPIO29<br><br />
<br />
Note: When using GPIO control, do not use uhubctl, as it may interfere with GPIO control.<br />
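Below is a minimal C sketch of GPIO-based switching. It is a sketch under stated assumptions, not the official method: it uses the libgpiod v1 C API (sudo apt install libgpiod-dev, link with -lgpiod), assumes the Pi 5 header GPIOs appear as "gpiochip4" (the chip name depends on the kernel version; check with gpioinfo), assumes GPIO28 controls USB1 as in the mapping above, and assumes driving the line low cuts the port power; verify the polarity on your board.<br />
<pre><br />
#include <gpiod.h><br />
#include <stdio.h><br />
#include <unistd.h><br />
<br />
int main(void)<br />
{<br />
    // Open the GPIO chip (assumed name; check "gpioinfo" on your kernel)<br />
    struct gpiod_chip *chip = gpiod_chip_open_by_name("gpiochip4");<br />
    if (!chip) { perror("open gpiochip4"); return 1; }<br />
<br />
    // GPIO28 = USB1 per the mapping above<br />
    struct gpiod_line *line = gpiod_chip_get_line(chip, 28);<br />
    if (!line || gpiod_line_request_output(line, "usb-power", 1) < 0) {<br />
        perror("request GPIO28"); gpiod_chip_close(chip); return 1;<br />
    }<br />
<br />
    gpiod_line_set_value(line, 0);  // assumed: low = USB1 power off<br />
    sleep(2);<br />
    gpiod_line_set_value(line, 1);  // assumed: high = USB1 power on<br />
<br />
    gpiod_line_release(line);<br />
    gpiod_chip_close(chip);<br />
    return 0;<br />
}<br />
</pre><br />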
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)5.79inch e-Paper Module (B)2024-03-12T02:50:55Z<p>Eng52: Created page with "{{#tweekihide:firstHeading|sidebar-left|sidebar-right}}__NOTOC__ <div style="background-color: #343434;text-align: center;color: white;padding: 20px;margin: 8px;"> =5.79inch..."</p>
<hr />
<div>{{#tweekihide:firstHeading|sidebar-left|sidebar-right}}__NOTOC__ <br />
<br />
<div style="background-color: #343434;text-align: center;color: white;padding: 20px;margin: 8px;"><br />
=5.79inch e-Paper Module (B) Manual=<br />
</div><br />
<p></p><br />
{{ContentGrid|grid-gap=25px<br />
|content =<br />
{{StudyCard<br />
|img=[[File:E-Paper_Introduction_4.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Overview]]<br />
|heading = Introduction<br />
|content = Parameters, principles, and precautions<br />
}}<br />
{{StudyCard<br />
|img=[[File:Rapberry Pi.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Working_With_Raspberry_Pi]]<br />
|heading = Working with Raspberry Pi<br />
|content = User guides for the development demo of C language, Python <br />
}}<br />
{{StudyCard<br />
|img=[[File:Arduino00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Working_With_Arduino]]<br />
|heading = Working with the Arduino<br />
|content = User guides for the development demo based on Arduino UNO R3<br />
}}<br />
{{StudyCard<br />
|img=[[File:Jetson Arduino00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Working_With_Jetson_Nano]]<br />
|heading = Working with the Jetson Nano<br />
|content = User guides for the development demo of C language, Python <br />
}}<br />
{{StudyCard<br />
|img=[[File:Sunrise X3 Pi.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Working_With_Sunrise_X3_Pi]]<br />
|heading = Working with Sunrise X3 Pi<br />
|content = User guides for the development demo based on Python<br />
}}<br />
{{StudyCard<br />
|img=[[File:STM321.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Working_With_STM32]]<br />
|heading = Working with the STM32<br />
|content = User guides for the development demo based on STM32CubeMX<br />
}}<br />
{{StudyCard<br />
|img=[[File:Resource0.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Resource]]<br />
|heading = Resources<br />
|content = Documentation, procedures and data sheets, etc<br />
}}<br />
{{StudyCard<br />
|img=[[File:FAQ01.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#FAQ]]<br />
|heading = FAQ<br />
|content = e-Paper frequently asked questions<br />
}}<br />
{{StudyCard<br />
|img=[[File:support00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual#Support]]<br />
|heading = Support<br />
|content = Technical support<br />
}}<br />
}}</div>Eng52https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_(B)_Manual5.79inch e-Paper Module (B) Manual2024-03-11T09:22:43Z<p>Eng52: /* FAQ */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item<br />
|name=5.79inch e-Paper (B) <br />
|name2=5.79inch e-Paper Module (B) <br />
|img=[[File:5.79inch e-Paper Module (B).jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/5.79inch-e-paper-b.htm}}]]<br />
|img2=[[File:5.79inch e-Paper Module (B)-2.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/5.79inch-e-paper-module-b.htm}}]]<br />
|caption2=Red, Black, White<br/>792 × 272<br/>Raspberry, Arduino, STM32, SPI<br/>Support Partial Refresh<br />
|category=[[:Category:OLEDs / LCDs|OLEDs / LCDs]], [[:Category:LCD|LCD]]<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[http://www.waveshare.net/shop 官方中文站点]<br />
|website_en=[http://www.waveshare.com Waveshare website]<br />
|interface1=SPI<br />
|related=<br />
{{Product List|OLEDs / LCDs/e-Paper}}<br />
}}<br />
=Overview=<br />
==Parameters==<br />
{|border=1; style="border:#dddddd; width:600px; line-height:200%" align="center"<br />
|- style="background:white; color:black;" align="center"<br />
| Size|| 5.79-inch<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Driver board dimensions|| 152.50mm × 58.50mm<br />
|- style="background:white; color:black;" align="center"<br />
| Outline dimensions (raw panel) || 150.92mm × 56.94mm × 1.00mm<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Display Size|| 139.00mm × 47.74mm<br />
|- style="background:white; color:black;" align="center"<br />
| Operating voltage || 3.3V / 5V (IO level voltage should be the same as the supply voltage)<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Communication interface|| SPI<br />
|- style="background:white; color:black;" align="center"<br />
| Dot pitch|| 0.1755mm × 0.1755mm<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Resolution || 792 × 272<br />
|- style="background:white; color:black;" align="center"<br />
| Display color|| Red, Black, White <br />
|- style="background:#efefef; color:black;" align="center"<br />
| Grey scale || 24<br />
|- style="background:white; color:black;" align="center"<br />
| Refresh time || 3.5s<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Refresh power|| < 50mW(typ.)<br />
|- style="background:white; color:black;" align="center"<br />
| Standby Current|| < 0.01uA (almost 0)<br />
|}<br />
*Refresh time: the refresh time is an experimental result; the actual refresh time may vary, and the actual effect shall prevail. There is a flickering effect during a global refresh; this is normal.<br />
*Power consumption: the power consumption data is an experimental result. Actual power consumption will vary somewhat with the driver board and actual usage; the actual effect shall prevail.<br />
<br />
{{e-Paper_SPI_Communication}}<br />
{{e-Paper_Principle}}<br />
<br />
==Program Principle==<br />
*We define the pixels in a monochrome picture: 0 is black and 1 is white.<br /><br />
**White: □: bit 1<br /><br />
**Black: ■: bit 0<br /><br />
*The dot in the figure is called a pixel. Since 1 and 0 define the color, one bit can define the color of one pixel, and 1 byte = 8 pixels.<br /><br />
*For example, if we set the first 8 pixels to black and the last 8 pixels to white, expressed in code they are the 16 bits below:<br /><br />
[[file:e-paper_hardware_work_1.png]]<br /><br />
*A computer stores data high-order bit first, low-order bit later, and a byte has only 8 bits, so there is a small change:<br /><br />
[[file:e-paper_hardware_work_2.png]]<br /><br />
*In this way, only 2 bytes are needed to represent 16 pixels.<br />
*For a three-color (red, black, white) e-paper such as this module, the picture must be split into two pictures, one black-and-white and one red-and-white, because during transmission one register controls the black/white display and the other controls the red/white display. 1 byte of the black/white part controls 8 pixels, and 1 byte of the red/white part controls 8 pixels.<br />
*Suppose there are 8 pixels: the first 4 red and the last 4 black. They must be split into a black/white picture and a red/white picture, each of 8 pixels. In the black/white picture the first 4 pixels are white and the last 4 are black, while in the red/white picture the first 4 pixels are red and the last 4 are white.<br />
[[File:2.13inch-epPaer-B-pixels.png]]<br />
*If we stipulate that white is stored as 1, and red or black is stored as 0, then we have the following representation:<br />
[[File:2.13inch-epPaer-B-pixels-2.png]]<br />
*And 1 byte of the black and white part controls 8 pixels, and 1 byte of the red and white part controls 8 pixels, then it can be expressed as follows:<br />
[[File:2.13inch-epPaer-B-pixels-3.png]]<br />
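As an illustration of the split described above, here is a minimal C sketch. The pixel encoding (0 = black, 1 = white, 2 = red) and the function name are illustrative assumptions, not taken from the demo library; white is stored as bit 1 in both planes, as stipulated above.<br />
<pre><br />
#include <stdint.h><br />
#include <stddef.h><br />
<br />
enum { PX_BLACK = 0, PX_WHITE = 1, PX_RED = 2 };  // illustrative encoding<br />
<br />
void Split_Planes(const uint8_t *pixels, size_t count,<br />
                  uint8_t *black_plane, uint8_t *red_plane)<br />
{<br />
    for (size_t i = 0; i < count; i++) {<br />
        uint8_t mask = (uint8_t)(0x80 >> (i % 8));  // MSB first, 8 pixels/byte<br />
        // Black/white plane: only black pixels are 0 (red counts as white here)<br />
        if (pixels[i] == PX_BLACK) black_plane[i / 8] &= ~mask;<br />
        else                       black_plane[i / 8] |=  mask;<br />
        // Red/white plane: only red pixels are 0<br />
        if (pixels[i] == PX_RED)   red_plane[i / 8] &= ~mask;<br />
        else                       red_plane[i / 8] |=  mask;<br />
    }<br />
}<br />
</pre><br />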
==Dual IC Programming Analysis ==<br />
5.79inch e-Paper Module (B) is an e-Paper controlled by dual ICs. Before using it, you need to understand the following in addition to the programming principle above.<br /><br />
[[File:5.79inch_e-Paper_Module_bc-1.png]]<br /><br />
*As shown in the picture above, each IC controls half of the screen. <br />
*Since it is evenly divided, the x-axis will discard the lower four bits of the 50th byte (396/8=49.5).<br />
*The orientation of the screens controlled by the two ICs is mirrored, so mirror configuration is required during initialization. This means one IC counts from 0 to 396, while the other IC counts from 396 to 0.<br />
*All control pins are shared, with registers used to differentiate between the control of the two ICs.<br />
**Registers such as controlling refresh, sleep, VGH voltage, and VGL voltage are shared. <br />
**Control registers for direction, starting position, display area, etc., are separated at 0x80.<br />
***The register addresses for the M controller range from 0x00 to 0x79.<br />
*** The register addresses for the S controller are the M controller's register addresses + 0x80.<br />
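A minimal sketch of this addressing rule, assuming a hypothetical epd_send_command() SPI helper (the real demo code differs):<br />
<pre><br />
#define S_OFFSET 0x80<br />
<br />
// M controller registers: 0x00-0x79; S controller: M address + 0x80.<br />
void epd_write_m(uint8_t reg) { epd_send_command(reg); }<br />
void epd_write_s(uint8_t reg) { epd_send_command(reg + S_OFFSET); }<br />
<br />
// Per-controller settings (direction, start position, display area)<br />
// are written twice, once for each controller.<br />
void epd_write_both(uint8_t reg) { epd_write_m(reg); epd_write_s(reg); }<br />
</pre><br />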
{{E-paper-precautions-color}}<br />
{{5.79inch e-Paper Module (B) RPI}}<br />
{{5.79inch e-Paper Module (B) Arduino}}<br />
{{5.79inch e-Paper Module (B) Jetson Nano}}<br />
{{5.79inch e-Paper Module (B) Sunrise}}<br />
{{5.79inch e-Paper Module (B)STM32}}<br />
=Resource=<br />
==Document==<br />
*[https://files.waveshare.com/wiki/5.79inch-e-Paper-Module-(B)/5.79-inch-e-Paper-b-user-manual.pdf User Manual]<br />
*[https://files.waveshare.com/wiki/5.79inch-e-Paper-Module-(B)/5.79inch_e-Paper_Module.pdf Schematics]<br />
==Demo==<br />
*[https://files.waveshare.com/upload/7/71/E-Paper_code.zip Demo (zip)]<br />
*[https://github.com/waveshareteam/e-Paper Github]<br />
==Development Resources==<br />
*[https://www.waveshare.com/wiki/E-Paper_Floyd-Steinberg E-Paper Floyd-Steinberg]<br />
*[https://files.waveshare.com/upload/3/36/Image2Lcd.7z Image2Lcd.7z]<br />
*[https://www.waveshare.com/wiki/Image2Lcd_Image_Modulo Image2Lcd Image Modulo]<br />
*[https://files.waveshare.com/upload/c/c6/Zimo221.7z Zimo221.7z ]<br />
*[https://www.waveshare.com/wiki/E-Paper_API_Analysis E-Paper_API_Analysis]<br />
<br />
=FAQ=<br />
{{e-paper Hat color FAQ}}<br />
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/5.79inch_e-Paper_Module5.79inch e-Paper Module2024-03-11T08:55:42Z<p>Eng52: /* 5.79inch e-Paper Module */</p>
<hr />
<div>{{#tweekihide:firstHeading|sidebar-left|sidebar-right}}__NOTOC__ <br />
<br />
<div style="background-color: #343434;text-align: center;color: white;padding: 20px;margin: 8px;"><br />
=5.79inch e-Paper Module Manual=<br />
</div><br />
<p></p><br />
{{ContentGrid|grid-gap=25px<br />
|content =<br />
{{StudyCard<br />
|img=[[File:E-Paper_Introduction_4.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Overview]]<br />
|heading = Introduction<br />
|content = Parameters, principles, and precautions<br />
}}<br />
{{StudyCard<br />
|img=[[File:Rapberry Pi.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Working_With_Raspberry_Pi]]<br />
|heading = Working with Raspberry Pi<br />
|content = User guides for the development demo of C language, Python <br />
}}<br />
{{StudyCard<br />
|img=[[File:Arduino00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Working_With_Arduino]]<br />
|heading = Working with the Arduino<br />
|content = User guides for the development demo based on Arduino UNO R3<br />
}}<br />
{{StudyCard<br />
|img=[[File:Jetson Arduino00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Working_With_Jetson_Nano]]<br />
|heading = Working with the Jetson Nano<br />
|content = User guides for the development demo of C language, Python <br />
}}<br />
{{StudyCard<br />
|img=[[File:Sunrise X3 Pi.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Working_With_Sunrise_X3_Pi]]<br />
|heading = Working with Sunrise X3 Pi<br />
|content = User guides for the development demo based on Python<br />
}}<br />
{{StudyCard<br />
|img=[[File:STM321.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Working_With_STM32]]<br />
|heading = Working with the STM32<br />
|content = User guides for the development demo based on STM32CubeMX<br />
}}<br />
{{StudyCard<br />
|img=[[File:Resource0.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Resource]]<br />
|heading = Resources<br />
|content = Documentation, procedures and data sheets, etc<br />
}}<br />
{{StudyCard<br />
|img=[[File:FAQ01.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#FAQ]]<br />
|heading = FAQ<br />
|content = e-Paper frequently asked questions<br />
}}<br />
{{StudyCard<br />
|img=[[File:support00.jpg|121px|link=https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual#Support]]<br />
|heading = Support<br />
|content = Technical support<br />
}}<br />
}}</div>Eng52https://www.waveshare.com/wiki/5.79inch_e-Paper_Module_Manual5.79inch e-Paper Module Manual2024-03-11T07:56:21Z<p>Eng52: /* Parameters */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item<br />
|name=5.79inch e-Paper <br />
|name2=5.79inch e-Paper Module<br />
|img=[[File:5.79inch e-Paper Module.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/5.79inch-e-Paper.htm}}]]<br />
|img2=[[File:5.79inch e-Paper Module-2.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=http://www.waveshare.com/5.79inch-e-paper-module.htm}}]]<br />
|caption2=Black, White<br/>792 × 272<br/>Raspberry, Arduino, STM32, SPI<br/>Support Partial Refresh<br />
|category=[[:Category:OLEDs / LCDs|OLEDs / LCDs]], [[:Category:LCD|LCD]]<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[http://www.waveshare.net/shop 官方中文站点]<br />
|website_en=[http://www.waveshare.com Waveshare website]<br />
|interface1=SPI<br />
|related=<br />
{{Product List|OLEDs / LCDs/e-Paper}}<br />
}}<br />
=Overview=<br />
==Parameters==<br />
{|border=1; style="border:#dddddd; width:600px; line-height:200%" align="center"<br />
|- style="background:white; color:black;" align="center"<br />
| Size|| 5.79-inch<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Driver board dimensions|| 152.50mm × 58.50mm<br />
|- style="background:white; color:black;" align="center"<br />
| Outline dimensions (raw panel) || 150.92mm × 56.94mm × 1.00mm<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Display Size|| 139.00mm × 47.74mm<br />
|- style="background:white; color:black;" align="center"<br />
| Operating voltage || 3.3V / 5V (IO level voltage should be the same as the supply voltage)<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Communication interface|| SPI<br />
|- style="background:white; color:black;" align="center"<br />
| Dot pitch|| 0.1755mm × 0.1755mm<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Resolution || 792 × 272<br />
|- style="background:white; color:black;" align="center"<br />
| Display color|| Black, White <br />
|- style="background:#efefef; color:black;" align="center"<br />
| Grey scale || 4<br />
|- style="background:white; color:black;" align="center"<br />
| Refresh time || 3.5s<br />
|- style="background:#efefef; color:black;" align="center"<br />
| Refresh power|| < 50mW(typ.)<br />
|- style="background:white; color:black;" align="center"<br />
| Standby current|| < 0.01uA(almost 0)<br />
|}<br />
*Refresh time: the refresh time is an experimental result; the actual refresh time may vary, and the actual effect shall prevail. There is a flickering effect during a global refresh; this is normal.<br />
*Power consumption: the power consumption data is an experimental result. Actual power consumption will vary somewhat with the driver board and actual usage; the actual effect shall prevail.<br />
<br />
{{e-Paper_SPI_Communication}}<br />
{{e-Paper_Principle}}<br />
<br />
==Program Principle==<br />
*We define the pixels in a monochrome picture: 0 is black and 1 is white.<br /><br />
**White: □: bit 1<br /><br />
**Black: ■: bit 0<br /><br />
*The dot in the figure is called a pixel. Since 1 and 0 define the color, one bit can define the color of one pixel, and 1 byte = 8 pixels.<br /><br />
*For example, if we set the first 8 pixels to black and the last 8 pixels to white, expressed in code they are the 16 bits below:<br /><br />
[[file:e-paper_hardware_work_1.png]]<br /><br />
*A computer stores data high-order bit first, low-order bit later, and a byte has only 8 bits, so there is a small change:<br /><br />
[[file:e-paper_hardware_work_2.png]]<br /><br />
*In this way, only 2 bytes are needed to represent 16 pixels.<br />
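In C, this packing can be sketched as below (an illustrative helper under the convention above, not taken from the demo): pixels with value 1 (white) set their bit and pixels with value 0 (black) clear it, 8 pixels per byte, most significant bit first.<br />
<pre><br />
#include <stdint.h><br />
#include <stddef.h><br />
<br />
// Pack monochrome pixels (1 = white, 0 = black) into bytes, MSB first,<br />
// so that 1 byte = 8 pixels as described above.<br />
void Pack_Pixels(const uint8_t *pixels, size_t count, uint8_t *out)<br />
{<br />
    for (size_t i = 0; i < count; i++) {<br />
        if (pixels[i])<br />
            out[i / 8] |=  (uint8_t)(0x80 >> (i % 8));  // white: set the bit<br />
        else<br />
            out[i / 8] &= ~(uint8_t)(0x80 >> (i % 8));  // black: clear the bit<br />
    }<br />
}<br />
</pre><br />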
==Dual IC Programming Analysis ==<br />
5.79inch e-Paper is an e-Paper controlled by dual ICs. Before using it, you need to understand the following in addition to the programming principle above.<br /><br />
[[File:5.79inch_e-Paper_Module_bc-1.png]]<br /><br />
*As shown in the picture above, each IC controls half of the screen. <br />
*Since it is evenly divided, the x-axis will discard the lower four bits of the 50th byte (396/8=49.5).<br />
*The orientation of the screens controlled by the two ICs is mirrored, so mirror configuration is required during initialization. This means one IC counts from 0 to 396, while the other IC counts from 396 to 0.<br />
*All control pins are shared, with registers used to differentiate between the control of the two ICs.<br />
**Registers such as controlling refresh, sleep, VGH voltage, and VGL voltage are shared. <br />
**Control registers for direction, starting position, display area, etc., are separated at 0x80.<br />
***The register addresses for the M controller range from 0x00 to 0x79.<br />
*** The register addresses for the S controller are the M controller's register addresses + 0x80.<br />
{{e-paper-precautions_mono}}<br />
{{5.79inch e-Paper Module RPI}}<br />
{{5.79inch e-Paper Module Arduino}}<br />
{{5.79inch e-Paper Module Jetson}}<br />
{{5.79inch e-Paper Module Sunrise}}<br />
{{5.79inch e-Paper Module STM}}<br />
=Resource=<br />
==Document==<br />
*[https://files.waveshare.com/wiki/5.79inch-e-Paper-Module/5.79-inch-e-Paper-user-manual.pdf User Manual]<br />
*[https://files.waveshare.com/wiki/5.79inch-e-Paper-Module/5.79inch_e-Paper_Module.pdf Schematic]<br />
==Demo==<br />
*[https://files.waveshare.com/upload/7/71/E-Paper_code.zip Demo (zip)]<br />
*[https://github.com/waveshareteam/e-Paper Github]<br />
==Development Resources==<br />
*[https://www.waveshare.com/wiki/E-Paper_Floyd-Steinberg E-Paper Floyd-Steinberg]<br />
*[https://files.waveshare.com/upload/3/36/Image2Lcd.7z Image2Lcd.7z]<br />
*[https://www.waveshare.com/wiki/Image2Lcd_Image_Modulo Image2Lcd Image Modulo]<br />
*[https://files.waveshare.com/upload/c/c6/Zimo221.7z Zimo221.7z ]<br />
*[https://www.waveshare.com/wiki/E-Paper_API_Analysis E-Paper_API_Analysis]<br />
=FAQ=<br />
{{e-paper Hat color FAQ}}<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/PCIe_TO_M.2_Board_(C)PCIe TO M.2 Board (C)2024-03-11T06:08:49Z<p>Eng52: /* Load */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:PCIe TO M.2 Board (C).jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/pcie-to-m.2-board-c.htm}}]]<br />
|caption=<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
'''PCIe TO M.2 Board (C) for Raspberry Pi 5, Compatible With 2230 / 2242 / 2260 / 2280 Size M.2 Solid State Drive, Supports Gen2 And Gen3 Modes, Supports Booting PI5 From Solid State Drive, Side-mounting solution, Comes with Acrylic Mounting Plate.'''<br />
==Features==<br />
*Supports M.2 NVMe solid state drives, with high-speed read/write for higher working efficiency.<br />
*Supports PCI-E ×1 Gen2 or Gen3 mode.<br />
*Only supports the Raspberry Pi 5.<br />
*Compatible with 2230 / 2242 / 2260 / 2280 size M.2 solid state drives.<br />
*Onboard working indicator lights up when powered on; the ACT status indicator keeps blinking while writing/reading.<br />
==Note==<br />
*As the Raspberry Pi does not support booting from NVMe by default, you need to modify the bootloader configuration to enable it.<br />
<br />
=User Guide=<br />
==Hardware Connection==<br />
Please pay attention to the wiring direction, as shown below:<br><br />
[[File:PCIe TO M.2 Board (C)-Wire.png]]<br />
==Load==<br />
1: Enable PCIE interface:<br />
The PCIE interface is not enabled on the Raspberry Pi 5 by default; add the following to /boot/firmware/config.txt:<br />
dtparam=pciex1<br />
2: PCIE Gen2 is the default setting; if you want to enable PCIE Gen3, add the following to /boot/firmware/config.txt:<br />
dtparam=pciex1_gen=3<br />
3: Reboot the PI5 after the modification, and the device will be recognized. <br />
In the picture below, the SM2263 entry is the recognized SSD, and the other PI5 entries are the RPI chip.<br />
[[file:PCIe TO M.2 HAT+_W_1.png]]<br />
4: Partitioning: If partitioning and formatting have already been performed on another platform, skip this step. Caution: Partitioning and formatting will erase all data on the SSD, so proceed with caution.<br />
lsblk #see the disk (execute "sudo fdisk -l" for more details)<br />
[[file:PCIe TO M.2 HAT+_W_2.png]]<br />
Partition <br />
sudo fdisk /dev/nvme0n1 #operate on the whole device, do not append "p1"; create just one partition <br />
How to use fdisk:<br />
n New partition<br />
q Exit without saving<br />
p Print partition table <br />
m Print selection menu <br />
d Delete partition <br />
w Save and exit<br />
t Modify ID <br />
Add the partition and execute "n", and then press "w" to save and exit. <br />
<br />
5: Format:<br />
sudo mkfs. #Then press Tab to list the supported file system types; each suffix corresponds to a format you can format the drive into<br />
[[file:PCIe TO M.2 HAT+_W_3.png]]<br />
If I need to format it in "ext4" format, execute: <br />
sudo mkfs.ext4 /dev/nvme0n1p1<br />
Wait for a moment, when "done" appears for all, it means the formatting is complete.<br />
[[file:PCIe TO M.2 HAT+_W_4.png]]<br />
<br />
6: Load:<br />
Create the mounting directory:<br />
sudo mkdir toshiba<br />
Mount the device<br />
sudo mount /dev/nvme0n1p1 ./toshiba<br />
Check disk status<br />
df -h<br />
<br />
===Read/Write Test===<br />
Enter the directory to mount the disk:<br />
<pre><br />
cd toshiba<br />
</pre><br />
*Release the caches:<br />
<pre><br />
sudo sh -c "sync && echo 3 > /proc/sys/vm/drop_caches"<br />
</pre><br />
*Copy from the Raspberry Pi's memory to the drive (write test).<br />
<pre><br />
sudo dd if=/dev/zero of=./test_write count=2000 bs=1024k<br />
</pre><br />
[[File:PCIe TO M.2 HAT+_W_5.png|800px]]<br />
*Copy the contents of the drive to the Raspberry Pi's memory (read test).<br />
<pre><br />
sudo dd if=./test_write of=/dev/null count=2000 bs=1024k<br />
</pre><br />
[[File:pcie-m2-6new.png|800px]]<br />
*Note: Different drives and environments may give different results. The Raspberry Pi's own performance significantly affects the figures, so for accurate performance testing it is recommended to use a PC.<br/><br />
<br />
===Auto Mount ===<br />
If the test shows no issues and the disk is not needed as a system disk but only as extra storage, set up automatic mounting.<br />
<pre><br />
sudo nano /etc/fstab<br />
<br />
#Add the following content at the end:<br />
/dev/nvme0n1p1 /home/pi/toshiba ext4 defaults 0 0<br />
#/dev/nvme0n1p1 is the device name, /home/pi/toshiba is the mount directory, ext4 is the file system type, defaults means the default mount options <br />
#Reboot to take effect (make sure the mount works before rebooting, otherwise the system may fail to boot) <br />
sudo mount -a<br />
<br />
#And then reboot<br />
Check the device through lsblk <br />
</pre><br />
<br />
==Booting from NVMe SSD ==<br />
1: First, boot the Raspberry Pi from an SD card and check that the hardware works properly. <br><br />
<br />
2: Use the SD card to boot the Raspberry Pi and modify the config file, modify BOOT_ORDER:<br />
sudo rpi-eeprom-config --edit <br />
Change BOOT_ORDER=0xf41 to BOOT_ORDER=0xf416<br />
[[file:PCIe TO M.2 HAT+_W_6.png]]<br />
For more details, you can refer to [https://www.raspberrypi.com/documentation/computers/raspberry-pi.html#raspberry-pi-bootloader-configuration BOOT_ORDER ]<br />
<br />
3: Reboot the Raspberry Pi, and you can see the following content in serial port log during start-up:<br />
[[file:PCIe TO M.2 HAT+_W_7.png]]<br />
That means the modification is successful. <br />
If it still fails after several tries, connect the Pi to the network before modifying again (wait for network time synchronization), or set the correct time before modifying the file.<br />
4: Flash the system to the NVMe drive, connect it to the board, remove the SD card, and power on again.<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/1.83inch_LCD_Module1.83inch LCD Module2024-03-09T03:44:09Z<p>Eng52: /* Specifications */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:1.83inch LCD Module.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/1.83inch-lcd-module.htm}}]]<br />
|caption=1.83inch<br>240 × 280, SPI<br><br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
= Introduction =<br />
'''This product provides Raspberry Pi, STM32, Arduino, ESP32, and Raspberry Pi Pico examples'''<br />
<br />
{{Amazon|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|More = [https://www.waveshare.com/1.8inch-lcd-module.htm More]}}<br />
<br />
==Specifications==<br />
*Operating voltage: 3.3V/5V <font color="red"><big>'''(Please ensure that the supply voltage and logic voltage are consistent; otherwise, it may result in the device not working properly.)'''</big></font><br />
*Interface: SPI<br />
*LCD type: IPS<br />
*Driver: NV3030B<br />
*Resolution: 240(H)RGB x 280(V)<br />
*Display size: 30.197 x 35.230 mm<br />
*Pixel size: 0.1258(H) x 0.1258(V) mm<br />
*Dimension: 33 × 40 mm<br />
<br />
=Interface Description=<br />
The 1.83-inch LCD uses a GH1.25 8PIN interface. Please refer to the pin connection table according to the specific hardware. <font color="red">(Connect according to the pin definition table; the wire colors in the diagram are for reference only, actual colors may vary). </font><br />
<br />
==Raspberry Pi hardware connection==<br />
Please connect the LCD to your Raspberry Pi with the 8PIN cable according to the table below.<br /><br />
Whether using the pin header or the PH2.0 8PIN interface, connect according to the following table.<br><br />
{|border=1; style="width:700px;" align="center"<br />
|+Connect to Raspberry Pi<br />
|-style="background:green; color:white;" align="center"<br />
|rowspan="2"| LCD<br />
| colspan="2" | Raspberry Pi<br />
|-style="background:green; color:white;" align="center"<br />
|BCM2835<br />
|Board<br />
|- align="center"<br />
|VCC||3.3V||3.3V<br />
|- align="center"<br />
|GND||GND||GND<br />
|-align="center"<br />
|DIN||MOSI||19<br />
|-align="center"<br />
|CLK||SCLK||23<br />
|-align="center"<br />
|CS||CE0||24<br />
|-align="center"<br />
|DC||25||22<br />
|-align="center"<br />
|RST||27||13<br />
|-align="center"<br />
|BL||18||12<br />
|}<br />
As shown below:<br><br />
<div align="center"><br />
[[File:1.83-rpi.jpg|800px]]<br /><br />
</div><br />
==STM32 hardware connection==<br />
The provided example is based on the [https://www.waveshare.com/xnucleo-f103rb.htm XNUCLEO-F103RB] development board, and the connection method corresponds to the pins of the STM32F103RBT6. If there is a need to port the demo, please connect according to the actual pinout.<br><br />
<br />
{|border=1; style="width:600px;" align="center"<br />
|+STM32F103RB connection pin correspondence<br />
|-style="background:green; color:white;" align="center"<br />
|LCD||STM32<br />
|-align="center"<br />
|VCC||3.3V<br />
|-align="center"<br />
|GND||GND<br />
|-align="center"<br />
|DIN||PA7<br />
|-align="center"<br />
|CLK||PA5<br />
|-align="center"<br />
|CS||PB6<br />
|-align="center"<br />
|DC||PA8<br />
|-align="center"<br />
|RST||PA9<br />
|-align="center"<br />
|BL||PC7<br />
|}<br />
The connection diagram is as follows (click to enlarge):<br /><br />
<div align="center"><br />
[[File:1.83-STM32.jpg|800px]]<br /><br />
</div><br />
==Arduino hardware connection==<br />
The provided example is based on the [https://www.waveshare.com/r3-plus.htm ATmega328P] development board, and the connection method corresponds to the pins of the Arduino UNO R3. If there is a need to port the program, please connect according to the actual pinout.<br />
{|border=1; style="width:600px;" align="center"<br />
|+Arduino UNO Connection pin correspondence<br />
|-style="background:green; color:white;" align="center"<br />
|LCD||UNO <br />
|-align="center"<br />
|VCC||5V<br />
|-align="center"<br />
|GND||GND<br />
|-align="center"<br />
|DIN||D11<br />
|-align="center"<br />
|CLK||D13<br />
|-align="center"<br />
|CS||D10<br />
|-align="center"<br />
|DC||D7<br />
|-align="center"<br />
|RST||D8<br />
|-align="center"<br />
|BL||D9<br />
|}<br />
The connection diagram is as follows (click to enlarge):<br /><br />
<div align="center"><br />
[[File:1.83-Aduino.jpg|800px]]<br /><br />
</div><br />
==ESP32 hardware connection==<br />
The provided example is based on the [https://www.waveshare.com/esp32-s3-dev-kit-n8r8.htm ESP32-S3-WROOM-1-N8R8] development board, and the connection method corresponds to the pins of the ESP32-S3. If there is a need to port the demo, please connect according to the actual pinout.<br />
{|border=1; style="width:600px;" align="center"<br />
|-style="background:green; color:white;" align="center"<br />
|LCD||ESP32 <br />
|-align="center"<br />
|VCC||3V3<br />
|-align="center"<br />
|GND||GND<br />
|-align="center"<br />
|DIN||IO11<br />
|-align="center"<br />
|CLK||IO12<br />
|-align="center"<br />
|CS||IO10<br />
|-align="center"<br />
|DC||IO46<br />
|-align="center"<br />
|RST||IO3<br />
|-align="center"<br />
|BL||IO8<br />
|}<br />
The connection diagram is as follows (click to enlarge):<br /><br />
<div align="center"><br />
[[File:1.83inch-LCD-Module-ESP32.jpg|800px]]<br /><br />
</div><br />
==Pico hardware connection==<br />
The provided example is based on the [https://www.waveshare.com/raspberry-pi-pico.htm Raspberry Pi Pico], and the connection method corresponds to the pins of the Raspberry Pi Pico. If there is a need to port the demo, please connect according to the actual pinout.<br />
{|border=1; style="width:600px;" align="center"<br />
|-style="background:green; color:white;" align="center"<br />
|LCD||Pico<br />
|-align="center"<br />
|VCC||3.3V<br />
|-align="center"<br />
|GND||GND<br />
|-align="center"<br />
|DIN||GP11<br />
|-align="center"<br />
|CLK||GP10<br />
|-align="center"<br />
|CS||GP9<br />
|-align="center"<br />
|DC||GP8<br />
|-align="center"<br />
|RST||GP12<br />
|-align="center"<br />
|BL||GP13<br />
|}<br />
For example, the connection diagram for the Pico is as follows (click to enlarge):<br /><br />
<div align="center"><br />
[[File:1.83inch-LCD-Module-Pico.jpg|800px]]<br /><br />
</div><br />
<br />
==LCD and the controller==<br />
This LCD utilizes an internal controller, the NV3030B, which is a 240 × RGB × 320 pixel LCD controller, while the LCD itself has a resolution of 240 (H) RGB × 280 (V) pixels. Additionally, since the initialization control allows both landscape and portrait orientations, not all of the LCD's internal RAM is used.<br><br />
The LCD supports three input color formats per pixel: 12-bit, 16-bit, and 18-bit, i.e. RGB444, RGB565, and RGB666. This example program uses RGB565, the most commonly used format.<br><br />
Communication with the LCD is achieved via a four-wire SPI interface. This approach significantly reduces the GPIO usage and also ensures relatively fast communication speed.<br />
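Since the example program uses RGB565, it may help to see how a 24-bit color is packed into it; the helper below is an illustrative sketch, not part of the library.<br />
<pre><br />
#include <stdint.h><br />
<br />
// RGB565: red in the top 5 bits, green in the middle 6, blue in the low 5.<br />
uint16_t RGB888_To_RGB565(uint8_t r, uint8_t g, uint8_t b)<br />
{<br />
    return (uint16_t)(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));<br />
}<br />
// e.g. RGB888_To_RGB565(255, 0, 0) == 0xF800 (pure red)<br />
</pre><br />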
<br />
==Communication Protocol==<br />
[[file:0.96inch_lcd_module_spi.png|900px]]<br /><br />
Note: different from the traditional SPI protocol, the data line from the slave to the master is omitted, since the device only needs to receive display data.<br /><br />
RESX is the reset pin; it is pulled low to reset the module (e.g. at power-up) and kept high at all other times.<br /><br />
CSX is the slave chip select; the chip is enabled when CS is low.<br /><br />
D/CX is the data/command control pin: when DC = 0, a command is written; when DC = 1, data is written.<br /><br />
SDA is the data pin for transmitting RGB data; it works as the MOSI pin of the SPI interface.<br /><br />
SCL works as the SCLK pin of the SPI interface.<br /><br />
SPI communication has data transfer timing, determined by the combination of CPHA and CPOL.<br /><br />
CPOL determines the level of the serial synchronous clock in the idle state. When CPOL = 0, the idle level is low. CPOL has little effect on the transmission.<br /><br />
CPHA determines whether data is sampled on the first or the second clock edge of the serial synchronous clock; when CPHA = 0, data is sampled on the first edge.<br /><br />
There are 4 SPI communication modes. SPI0, in which CPHA = 0 and CPOL = 0, is the most commonly used.<br /><br />
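Putting the D/CX rule into code, below is a minimal sketch of the command/data write sequence. It reuses the DEV_Config helpers described in the Raspberry Pi sections that follow; LCD_DC_PIN and LCD_CS_PIN are assumed stand-ins for the library's actual pin macros.<br />
<pre><br />
// Illustrative sketch - LCD_DC_PIN / LCD_CS_PIN stand in for the library's<br />
// actual pin macros; the DEV_* helpers are documented below.<br />
void LCD_WriteCmd(uint8_t Cmd)<br />
{<br />
    DEV_Digital_Write(LCD_DC_PIN, 0);  // D/CX low: command<br />
    DEV_Digital_Write(LCD_CS_PIN, 0);  // CSX low: chip selected<br />
    DEV_SPI_WriteByte(Cmd);<br />
    DEV_Digital_Write(LCD_CS_PIN, 1);  // deselect<br />
}<br />
<br />
void LCD_WriteData(uint8_t Data)<br />
{<br />
    DEV_Digital_Write(LCD_DC_PIN, 1);  // D/CX high: data<br />
    DEV_Digital_Write(LCD_CS_PIN, 0);<br />
    DEV_SPI_WriteByte(Data);<br />
    DEV_Digital_Write(LCD_CS_PIN, 1);<br />
}<br />
</pre><br />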
<br />
<br />
=Working with Raspberry Pi=<br />
==Enable SPI interface==<br />
<div class="cautionSec">PS: If you are using the system of the Bullseye branch, you need to change "apt-get" to "apt", the system of the Bullseye branch only supports Python3. </div><br />
*Open terminal, use command to enter the configuration page<br /><br />
<pre><br />
sudo raspi-config<br />
Choose Interfacing Options -> SPI -> Yes to enable SPI interface<br />
</pre><br />
[[file:RPI_open_spi.png|900px]]<br /><br />
Reboot Raspberry Pi:<br /><br />
<pre><br />
sudo reboot<br />
</pre><br />
Please make sure SPI is not occupied by other devices; you can check this in /boot/config.txt.<br /><br />
<br />
{{RPI_C_lib}}<br />
===Python===<br />
<pre><br />
#python2<br />
sudo apt-get update<br />
sudo apt-get install python-pip<br />
sudo apt-get install python-pil<br />
sudo apt-get install python-numpy<br />
sudo pip install RPi.GPIO<br />
sudo pip install spidev<br />
#python3<br />
sudo apt-get update<br />
sudo apt-get install python3-pip<br />
sudo apt-get install python3-pil<br />
sudo apt-get install python3-numpy<br />
sudo pip3 install RPi.GPIO<br />
sudo pip3 install spidev<br />
</pre><br />
<br />
==Download Examples==<br />
Open the Raspberry Pi terminal and run the following command:<br /><br />
<pre><br />
sudo apt-get install unzip -y<br />
sudo wget https://files.waveshare.com/wiki/1.83inch-LCD-Module/LCD_1.83_Code.zip<br />
sudo unzip ./LCD_1.83_Code.zip<br />
cd LCD_1.83_Code/RaspberryPi/<br />
</pre><br />
<br />
==Run the demo codes==<br />
Please go into the RaspberryPi directory (demo codes) first and run the commands in the terminal.<br /><br />
===C codes===<br />
*Re-compile the demo codes.<br /><br />
<pre><br />
cd c<br />
sudo make clean<br />
sudo make -j 8<br />
</pre><br />
<br />
*The test demo of all screens can be called directly by entering the corresponding size:<br />
sudo ./main Screen Size<br />
Depending on the LCD, one of the following commands should be entered:<br />
<pre><br />
#0.85inch LCD Module<br />
sudo ./main 0.85<br />
#0.96inch LCD Module<br />
sudo ./main 0.96<br />
#1.14inch LCD Module<br />
sudo ./main 1.14<br />
#1.28inch LCD Module<br />
sudo ./main 1.28<br />
#1.3inch LCD Module<br />
sudo ./main 1.3<br />
#1.47inch LCD Module<br />
sudo ./main 1.47<br />
#1.5inch LCD Module<br />
sudo ./main 1.5<br />
#1.54inch LCD Module<br />
sudo ./main 1.54<br />
#1.8inch LCD Module<br />
sudo ./main 1.8<br />
#1.83inch LCD Module<br />
sudo ./main 1.83<br />
#2inch LCD Module<br />
sudo ./main 2<br />
#2.4inch LCD Module<br />
sudo ./main 2.4<br />
</pre><br />
<br />
===python===<br />
*Enter the Python program directory and run the command ls -l.<br /><br />
<pre><br />
cd python/examples<br />
ls -l<br />
</pre><br />
[[File:LCD_rpi_python_examples.png|1000px]]<br /><br />
Test programs for all screens can be viewed, sorted by size:<br><br />
0inch85_LCD_test.py 0.85inch LCD test program<br><br />
0inch96_LCD_test.py 0.96inch LCD test program<br><br />
1inch14_LCD_test.py 1.14inch LCD test program<br><br />
1inch28_LCD_test.py 1.28inch LCD test program<br><br />
1inch3_LCD_test.py 1.3inch LCD test program<br><br />
1inch47_LCD_test.py 1.47inch LCD test program<br><br />
1inch5_LCD_test.py 1.5inch LCD test program<br><br />
1inch54_LCD_test.py 1.54inch LCD test program<br><br />
1inch8_LCD_test.py 1.8inch LCD test program<br><br />
1inch83_LCD_test.py 1.83inch LCD test program<br><br />
2inch_LCD_test.py 2inch LCD test program<br><br />
2inch4_LCD_test.py 2.4inch LCD test program<br><br />
<br />
*Just run the program corresponding to your screen; the programs support python2/3.<br />
<pre><br />
# python2<br />
sudo python 0inch85_LCD_test.py<br />
sudo python 0inch96_LCD_test.py<br />
sudo python 1inch14_LCD_test.py<br />
sudo python 1inch28_LCD_test.py<br />
sudo python 1inch3_LCD_test.py<br />
sudo python 1inch47_LCD_test.py<br />
sudo python 1inch5_LCD_test.py<br />
sudo python 1inch54_LCD_test.py<br />
sudo python 1inch8_LCD_test.py<br />
sudo python 1inch83_LCD_test.py<br />
sudo python 2inch_LCD_test.py<br />
sudo python 2inch4_LCD_test.py<br />
# python3<br />
sudo python3 0inch85_LCD_test.py<br />
sudo python3 0inch96_LCD_test.py<br />
sudo python3 1inch14_LCD_test.py<br />
sudo python3 1inch28_LCD_test.py<br />
sudo python3 1inch3_LCD_test.py<br />
sudo python3 1inch47_LCD_test.py<br />
sudo python3 1inch5_LCD_test.py<br />
sudo python3 1inch54_LCD_test.py<br />
sudo python3 1inch8_LCD_test.py<br />
sudo python3 1inch83_LCD_test.py<br />
sudo python3 2inch_LCD_test.py<br />
sudo python3 2inch4_LCD_test.py<br />
</pre><br />
The Raspberry Pi series can share one set of programs, since they are all embedded systems with strong compatibility.<br><br />
The program is divided into a bottom-layer hardware interface, a middle-layer LCD screen driver, and an upper-layer application.<br />
==Hardware Interface==<br />
The low-level operations are encapsulated; since the hardware platforms and their internal implementations differ, you can check the corresponding directory if you need the details. <br><br />
You can open DEV_Config.c(.h), found in the directory RaspberryPi\c\lib\Config, to see the definitions.<br><br />
<pre><br />
1. There are three driver options for C: the BCM2835 library, the WiringPi library, and the Dev library<br />
2. We use the Dev library by default. If you need to change to the BCM2835 or WiringPi library, please open RaspberryPi\c\Makefile and modify lines 13-15 as follows: <br />
</pre><br />
[[File:RPI_open_spi1.png|900px]]<br /><br />
*Data type:<br />
<pre><br />
#define UBYTE uint8_t<br />
#define UWORD uint16_t<br />
#define UDOUBLE uint32_t<br />
</pre><br />
*Module initialization and exit processing.<br />
<pre><br />
void DEV_Module_Init(void);<br />
void DEV_Module_Exit(void);<br />
Note: <br />
Here is some GPIO processing before and after using the LCD screen.<br />
</pre><br />
*GPIO read and write:<br />
<pre><br />
void DEV_Digital_Write(UWORD Pin, UBYTE Value);<br />
UBYTE DEV_Digital_Read(UWORD Pin);<br />
</pre><br />
*SPI write data:<br />
<pre><br />
void DEV_SPI_WriteByte(UBYTE Value);<br />
</pre><br />
<br />
==Upper application==<br />
If you need to draw pictures or display Chinese and English characters, we provide some basic graphics-processing functions in the directory RaspberryPi\c\lib\GUI\GUI_Paint.c(.h).<br><br />
[[File:LCD_rpi_GUI.png|900px]]<br /><br />
The fonts can be found in RaspberryPi\c\lib\Fonts directory.<br /><br />
[[File:RPI_open_spi3.png|900px]]<br /><br />
<br />
*New Image Properties: Create a new image buffer, this property includes the image buffer name, width, height, flip Angle, and color.<br /><br />
<pre><br />
void Paint_NewImage(UBYTE *image, UWORD Width, UWORD Height, UWORD Rotate, UWORD Color)<br />
Parameters:<br />
Image: the name of the image buffer, which is actually a pointer to the first address of the image buffer;<br />
Width: image buffer Width;<br />
Height: the Height of the image buffer;<br />
Rotate: Indicates the rotation Angle of an image<br />
Color: the initial Color of the image;<br />
</pre><br />
<br />
*Select image buffer: The purpose of the selection is that you can create multiple image attributes, there can be multiple images buffer, you can select each image you create.<br /><br />
<pre><br />
void Paint_SelectImage(UBYTE *image)<br />
Parameters:<br />
Image: the name of the image buffer, which is actually a pointer to the first address of the image buffer;<br />
</pre><br />
<br />
*Image Rotation: Set the rotation angle of the selected image, preferably right after Paint_SelectImage(); you can choose to rotate by 0, 90, 180, or 270 degrees.<br /><br />
[[File:Rotation-lcd.png]]<br/><br />
<pre><br />
void Paint_SetRotate(UWORD Rotate)<br />
Parameters:<br />
Rotate: ROTATE_0, ROTATE_90, ROTATE_180, and ROTATE_270 correspond to 0, 90, 180, and 270 degrees.<br />
</pre><br />
<br />
*Image mirror flip: Set the mirror flip of the selected image. You can choose no mirror, horizontal mirror, vertical mirror, or image center mirror.<br /><br />
<pre><br />
void Paint_SetMirroring(UBYTE mirror)<br />
Parameters:<br />
Mirror: indicates the image mirroring mode. MIRROR_NONE, MIRROR_HORIZONTAL, MIRROR_VERTICAL, MIRROR_ORIGIN correspond to no mirror, horizontal mirror, vertical mirror, and image center mirror respectively.<br />
</pre><br />
<br />
*Set a pixel's display position and color in the buffer: this is the core GUI function; it handles where each point is placed in the buffer and with which color.<br /><br />
<pre><br />
void Paint_SetPixel(UWORD Xpoint, UWORD Ypoint, UWORD Color)<br />
Parameters:<br />
Xpoint: the X position of a point in the image buffer<br />
Ypoint: Y position of a point in the image buffer<br />
Color: indicates the Color of the dot<br />
</pre><br />
<br />
*Image buffer fill color: Fills the whole image buffer with a color, usually used to clear the screen to blank.<br />
<pre><br />
void Paint_Clear(UWORD Color)<br />
Parameters:<br />
Color: fill Color<br />
</pre><br />
<br />
*Fill the color of a certain window in the image buffer: fills part of the buffer with a color, usually to refresh an area to blank; often used for time displays, to erase the previous second on the screen.<br />
<pre><br />
void Paint_ClearWindows(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color)<br />
Parameters:<br />
Xstart: the x-starting coordinate of the window<br />
Ystart: the y-starting coordinate of the window<br />
Xend: the x-end coordinate of the window<br />
Yend: the y-end coordinate of the window<br />
Color: fill Color<br />
</pre><br />
<br />
*Draw point: In the image buffer, draw a point at (Xpoint, Ypoint); you can choose the color, the size, and the style of the point.<br /><br />
<pre><br />
void Paint_DrawPoint(UWORD Xpoint, UWORD Ypoint, UWORD Color, DOT_PIXEL Dot_Pixel, DOT_STYLE Dot_Style)<br />
Parameters:<br />
Xpoint: indicates the X coordinate of a point.<br />
Ypoint: indicates the Y coordinate of a point.<br />
Color: fill Color<br />
Dot_Pixel: The size of the dot; the demo provides 8 sizes by default.<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Dot_Style: how the point expands: either outward from its center, or to the right and upward from its bottom-left corner.<br />
typedef enum {<br />
DOT_FILL_AROUND = 1,<br />
DOT_FILL_RIGHTUP,<br />
} DOT_STYLE;<br />
</pre><br />
<br />
*Draw line: In the image buffer, draw a line from (Xstart, Ystart) to (Xend, Yend); you can choose the color, the width, and the style of the line.<br /><br />
<pre><br />
void Paint_DrawLine(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color, DOT_PIXEL Line_width, LINE_STYLE Line_Style)<br />
Parameters:<br />
Xstart: the x-starting coordinate of a line<br />
Ystart: the y-starting coordinate of a line<br />
Xend: the x-end coordinate of a line<br />
Yend: the y-end coordinate of a line<br />
Color: fill Color<br />
Line_width: The width of the line; the demo provides 8 widths by default.<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Line_Style: line style. Selects whether the line is drawn solid or dotted.<br />
typedef enum {<br />
LINE_STYLE_SOLID = 0,<br />
LINE_STYLE_DOTTED,<br />
} LINE_STYLE;<br />
</pre><br />
<br />
* Draw rectangle: In the image buffer, draw a rectangle from (Xstart, Ystart) to (Xend, Yend); you can choose the color, the width of the line, and whether to fill the inside of the rectangle.<br /><br />
<pre><br />
void Paint_DrawRectangle(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color, DOT_PIXEL Line_width, DRAW_FILL Draw_Fill)<br />
Parameters:<br />
Xstart: the starting X coordinate of the rectangle<br />
Ystart: the starting Y coordinate of the rectangle<br />
Xend: the x-end coordinate of the rectangle<br />
Yend: the y-end coordinate of the rectangle<br />
Color: fill Color<br />
Line_width: The width of the four sides of the rectangle; the demo provides 8 widths by default.<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Draw_Fill: Fill, whether to fill the inside of the rectangle<br />
typedef enum {<br />
DRAW_FILL_EMPTY = 0,<br />
DRAW_FILL_FULL,<br />
} DRAW_FILL;<br />
</pre><br />
*Draw circle: In the image buffer, draw a circle of Radius with (X_Center Y_Center) as the center. You can choose the color, the width of the line, and whether to fill the inside of the circle.<br /><br />
<pre><br />
void Paint_DrawCircle(UWORD X_Center, UWORD Y_Center, UWORD Radius, UWORD Color, DOT_PIXEL Line_width, DRAW_FILL Draw_Fill)<br />
Parameters:<br />
X_Center: the x-coordinate of the center of the circle<br />
Y_Center: the y-coordinate of the center of the circle<br />
Radius: indicates the Radius of a circle<br />
Color: fill Color<br />
Line_width: The width of the arc; the demo provides 8 widths by default<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Draw_Fill: fill, whether to fill the inside of the circle<br />
typedef enum {<br />
DRAW_FILL_EMPTY = 0,<br />
DRAW_FILL_FULL,<br />
} DRAW_FILL;<br />
</pre><br />
*Write Ascii character: In the image buffer, with (Xstart, Ystart) as the left vertex, write an Ascii character; you can select the Ascii character font library, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawChar(UWORD Xstart, UWORD Ystart, const char Ascii_Char, sFONT* Font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y-coordinate of the left vertex of a character<br />
Ascii_Char: indicates the Ascii character<br />
Font: Ascii visual character library, in the Fonts folder the demo provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write English string: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of English characters; you can choose the Ascii character font library, the font foreground color, and the font background color.<br /><br />
<pre><br />
void Paint_DrawString_EN(UWORD Xstart, UWORD Ystart, const char * pString, sFONT* Font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PString: pointer to the string to be displayed<br />
Font: Ascii visual character library, in the Fonts folder the demo provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write Chinese string: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of Chinese characters; you can choose the GB2312-encoded character font, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawString_CN(UWORD Xstart, UWORD Ystart, const char * pString, cFONT* font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PString: pointer to the string to be displayed<br />
Font: GB2312 encoding character Font library, in the Fonts folder the demo provides the following Fonts:<br />
Font12CN: ASCII font 11*21, Chinese font 16*21<br />
Font24CN: ASCII font 24*41, Chinese font 32*41<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write numbers: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of numbers; you can choose the Ascii character font library, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawNum(UWORD Xpoint, UWORD Ypoint, double Nummber, sFONT* Font, UWORD Digit, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xpoint: the x-coordinate of the left vertex of a character<br />
Ypoint: the Y coordinate of the left vertex of the font<br />
Nummber: the number to display, which can include decimals<br />
Digit: the number of digits to show after the decimal point<br />
Font: Ascii visual character library, in the Fonts folder the demo provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Display time: In the image buffer, with (Xstart, Ystart) as the left vertex, display the time; you can choose the Ascii character font, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawTime(UWORD Xstart, UWORD Ystart, PAINT_TIME *pTime, sFONT* Font, UWORD Color_Background, UWORD Color_Foreground)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PTime: the time to display; a time structure is defined here, just pass the hours, minutes, and seconds to the parameters;<br />
Font: Ascii visual character library, in the Fonts folder the demo provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
<br />
*Read a local BMP image and write it to the buffer.<br />
On Linux systems such as Raspberry Pi, images can be read from and written to files.<br />
For Raspberry Pi, the functions are in the directory RaspberryPi\c\lib\GUI\GUI_BMPfile.c(.h).<br />
<pre><br />
UBYTE GUI_ReadBmp(const char *path, UWORD Xstart, UWORD Ystart)<br />
parameter:<br />
path: the relative path of the BMP image<br />
Xstart: The X coordinate of the left vertex of the image, generally 0 is passed by default<br />
Ystart: The Y coordinate of the left vertex of the picture, generally 0 by default<br />
</pre><br />
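Below is a minimal end-to-end C sketch tying these GUI functions together: allocate a buffer, draw into it, then push it to the screen. The LCD_1IN83_* names, HORIZONTAL, and the color macros follow the demo's usual naming pattern and may differ for your screen size; treat this as a sketch, not the demo's exact code.<br />
<pre><br />
/* Minimal sketch: allocate a buffer, draw, and push it to the screen.<br />
   LCD_1IN83_* names follow the demo's naming pattern; adjust to your size. */<br />
UWORD *BlackImage;<br />
UDOUBLE Imagesize = LCD_1IN83_WIDTH * LCD_1IN83_HEIGHT * 2; /* 2 bytes per RGB565 pixel */<br />
if ((BlackImage = (UWORD *)malloc(Imagesize)) == NULL) {<br />
    printf("Failed to allocate image memory\r\n");<br />
    exit(0);<br />
}<br />
DEV_Module_Init();<br />
LCD_1IN83_Init(HORIZONTAL);<br />
<br />
Paint_NewImage((UBYTE *)BlackImage, LCD_1IN83_WIDTH, LCD_1IN83_HEIGHT, ROTATE_0, WHITE);<br />
Paint_SelectImage((UBYTE *)BlackImage);<br />
Paint_Clear(WHITE);<br />
Paint_DrawLine(20, 10, 70, 60, RED, DOT_PIXEL_1X1, LINE_STYLE_SOLID);<br />
Paint_DrawRectangle(20, 10, 70, 60, BLUE, DOT_PIXEL_1X1, DRAW_FILL_EMPTY);<br />
Paint_DrawCircle(120, 40, 30, GREEN, DOT_PIXEL_1X1, DRAW_FILL_EMPTY);<br />
Paint_DrawString_EN(10, 80, "Hello", &Font16, BLACK, WHITE);<br />
<br />
LCD_1IN83_Display(BlackImage); /* push the buffer to the LCD */<br />
free(BlackImage);<br />
DEV_Module_Exit();<br />
</pre><br />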
<br />
===Python (for Raspberry Pi)===<br />
Works with python and python3.<br><br />
In Python, the calls are not as complicated as in C.<br><br />
Raspberry Pi: RaspberryPi\python\lib\<br><br />
:[[File:1.83LCD_rpi_python_lib.jpg|800px]]<br /><br />
====lcdconfig.py====<br />
*Module initialization and exit processing.<br />
<pre><br />
def module_init()<br />
def module_exit()<br />
Note:<br />
1. Here is some GPIO processing before and after using the LCD screen.<br />
2. module_init() is called automatically by the LCD's Init() function, but module_exit() must be called manually.<br />
</pre><br />
*GPIO read and write:<br />
<pre><br />
def digital_write(pin, value)<br />
def digital_read(pin)<br />
</pre><br />
*SPI write data:<br />
<pre><br />
def spi_writebyte(data)<br />
</pre><br />
*xxx_LCD_test.py (xxx indicates the size, if it is a 0.96inch LCD, it is 0inch96_LCD_test.py, and so on)<br />
<br />
The Python demos are in the following directory:<br><br />
Raspberry Pi: RaspberryPi\python\examples\<br />
:[[File:LCD_rpi_python_examples2.png|800px]]<br /><br />
If your Python version is Python 2 and you need to run the 0.96inch LCD test program, execute the following in the Linux terminal:<br />
sudo python 0inch96_LCD_test.py<br />
If your Python version is Python 3 and you need to run the 0.96inch LCD test program, execute the following in the Linux terminal:<br />
sudo python3 0inch96_LCD_test.py<br />
====About Rotation Settings====<br />
If you need to rotate the screen in the Python program, set it with the statement below:<br />
<pre><br />
im_r= image1.rotate(270)<br />
</pre><br />
: Rotation effect, taking the 1.54inch as an example; the order is 0°, 90°, 180°, 270°<br />
:[[File:LCD_Rotate.jpg|1000px]]<br /><br />
===='''GUI Functions'''====<br />
Python has an image library, [http://effbot.org/imagingbook PIL official library link]; unlike C, you do not need to write the drawing logic yourself and can call the image library directly for image processing. The following takes the 1.54inch LCD as an example to briefly describe the demo.<br />
*The PIL image library needs to be installed first:<br />
sudo apt-get install python3-pil <br />
Then import the library:<br /><br />
from PIL import Image, ImageDraw, ImageFont<br />
Among them, Image is the basic library, ImageDraw is the drawing function, and ImageFont is the text function.<br />
*Define an image cache to facilitate drawing, writing, and other functions on the picture.<br />
image1 = Image.new("RGB", (disp.width, disp.height), "WHITE")<br />
The first parameter defines the color depth: "RGB" means an RGB888 color image. The second parameter is a tuple defining the width and height of the image. The third parameter defines the initial color of the buffer, here "WHITE".<br />
<br />
*Create a drawing object based on image1; all drawing operations are performed on this object.<br />
draw = ImageDraw.Draw(image1)<br />
*Draw a line.<br />
draw.line([(20, 10),(70, 60)], fill = "RED",width = 1)<br />
The first parameter is a list of two points, drawing a straight line from the start point (20, 10) to the end point (70, 60); fill = "RED" makes the line red, and width = 1 makes the line 1 pixel wide.<br />
*Draw a rectangle.<br />
draw.rectangle([(20,10),(70,60)],fill = "WHITE",outline="BLUE")<br />
The first argument is a list of two points: (20,10) is the upper-left corner of the rectangle and (70,60) is its lower-right corner; outline = "BLUE" makes the outline blue.<br />
<br />
*Draw a circle.<br />
draw.arc((150, 15, 190, 55), 0, 360, fill = (0, 255, 0))<br />
This draws the inscribed circle of a square: the first parameter is a 4-element tuple, with (150, 15) as the upper-left vertex of the square and (190, 55) as the lower-right vertex; 0 degrees lies on the horizontal midline of the bounding rectangle. The second parameter is the starting angle, the third parameter is the ending angle, and fill = (0,255,0) makes the line green.<br />
If the bounding box given by the coordinates is not square, you will get an ellipse.<br />
<br />
Besides the arc function, you can also use the ellipse function to draw a solid circle.<br />
draw.ellipse((150,65,190,105), fill = 0)<br />
The first parameter is the bounding rectangle of the ellipse, and fill is the fill color.<br />
*Character.<br />
The ImageFont module needs to be imported and instantiated:<br />
<pre><br />
Font1 = ImageFont.truetype("../Font/Font01.ttf",25)<br />
Font2 = ImageFont.truetype("../Font/Font01.ttf",35)<br />
Font3 = ImageFont.truetype("../Font/Font02.ttf",32)<br />
</pre><br />
You can use Windows fonts or any other fonts in TTF format.<br /><br />
Note: Each font file contains a different set of characters; if some characters cannot be displayed, check the encoding set used by the font.<br />
To draw English characters, you can use the fonts directly; for Chinese characters, prefix the string with u:<br />
<pre><br />
draw.text((40, 50), 'WaveShare', fill = (128,255,128),font=Font2)<br />
text= u"微雪电子"<br />
draw.text((74, 150),text, fill = "WHITE",font=Font3)<br />
</pre><br />
The first parameter is a 2-element tuple, with (40, 50) as the left vertex; the font is Font2, and fill is the font color. You can simply write fill = "WHITE", because the common color values are predefined; of course, you can also use fill = (128,255,128), where the values in parentheses correspond to the three RGB components, letting you control the color precisely. The second statement displays the Waveshare Electronics string using Font3, with the font color white.<br/><br />
*Read a local image.<br />
image = Image.open('../pic/LCD_1inch28.jpg')<br />
The parameter is the image path.<br />
*Other functions.<br />
For more information, refer to the PIL documentation: http://effbot.org/imagingbook<br />
<br />
{{LCD for stm32cubeide use}}<br />
<br />
=Arduino Software Description=<br />
Note: The demos are all tested on the Arduino UNO. If you use another type of Arduino, check whether the connected pins are correct. <br /><br />
[[Template: Arduino IDE Installation Steps]]<br />
==Run program==<br />
Download [https://files.waveshare.com/upload/e/e9/LCD_Module_code.7z the program] from the product wiki page, and then unzip it. The Arduino program is located at ~/Arduino/… <br /><br />
[[File:LCD_arduino_cede1.png|700px]]<br /><br />
<br />
Open the demo corresponding to your LCD screen model, here the 1.83inch:<br /><br />
[[File:1.83inch LCD_Arduino.jpg|700px]]<br /><br />
Open the demo and select the development board model Arduino UNO:<br />
[[File: Arduino for 1.69inch lcd module03.jpg|700px]]<br /><br />
Select the corresponding COM port:<br /><br />
[[File: Arduino for 1.69inch lcd module04.jpg|700px]]<br /><br />
Then click to compile and download.<br /><br />
[[File:1.83LCD_arduino_cede5.jpg|700px]]<br /><br />
<br />
==Program Description==<br />
===Document Introduction===<br />
Take Arduino UNO controlling a 1.83-inch LCD as an example, open the Arduino\LCD_1inch83 directory:<br /><br />
[[File:1.83LCD_arduino_ide_codeDescription1.jpg|800px]]<br /><br />
Of which: <br /><br />
LCD_1inch83.ino: open this with the Arduino IDE;<br /><br />
LCD_Driver.cpp(.h): the driver of the LCD screen;<br /><br />
DEV_Config.cpp(.h): the hardware interface definition; it encapsulates reading and writing pin levels, SPI data transfers, and pin initialization;<br /><br />
font8.cpp, font12.cpp, font16.cpp, font20.cpp, font24.cpp, font24CN.cpp, fonts.h: fonts for characters of different sizes;<br /><br />
image.cpp(.h): the image data; any BMP image can be converted into a 16-bit true-color image array with the Img2Lcd tool (downloadable in the resources). <br /><br />
The program is divided into bottom-layer hardware interface, middle-layer LCD screen driver, and upper-layer application;<br /><br />
<br />
===Bottom Hardware Interface===<br />
The hardware interface is defined in the two files DEV_Config.cpp(.h), and functions such as read and write pin level, delay, and SPI transmission are encapsulated. <br /><br />
<br />
*Write pin level:<br />
<pre><br />
void DEV_Digital_Write(int pin, int value)<br />
</pre><br />
The first parameter is the pin, and the second is the high and low level. <br /><br />
<br />
*Read pin level:<br />
<pre><br />
int DEV_Digital_Read(int pin)<br />
</pre><br />
The parameter is the pin, and the return value is the level of the read pin. <br /><br />
<br />
*Delay:<br />
<pre><br />
DEV_Delay_ms(unsigned int delaytime)<br />
</pre><br />
Millisecond-level delay. <br /><br />
<br />
*SPI output data:<br />
<pre><br />
DEV_SPI_WRITE(unsigned char data)<br />
</pre><br />
The parameter is char type, occupying 8 bits. <br /><br />
<br />
===The upper application===<br />
To draw graphics, display Chinese and English characters, show images, and so on, use the upper-layer application; we provide some basic graphics-processing functions in GUI_Paint.c(.h).<br /><br />
Note: Because the internal RAM of the STM32 and Arduino is limited, the GUI writes directly to the LCD's RAM. <br /><br />
[[File:LCD_arduino_ide_codeDescription_GUI.png|700px]]<br /><br />
The fonts used by the GUI all depend on the font*.cpp(h) files in the same folder.<br /><br />
[[File:LCD_arduino_ide_codeDescription_font.png|700px]]<br /><br />
<br />
*New Image Properties: Create a new image property; this includes the image buffer name, width, height, rotation angle, and color.<br/><br />
<pre><br />
void Paint_NewImage(UWORD Width, UWORD Height, UWORD Rotate, UWORD Color)<br />
Parameters:<br />
Width: image buffer Width;<br />
Height: the Height of the image buffer;<br />
Rotate: Indicates the rotation Angle of an image<br />
Color: the initial Color of the image;<br />
</pre><br />
*Set the screen-clearing function; usually the LCD's own clear function is passed in directly.<br />
<pre><br />
void Paint_SetClearFuntion(void (*Clear)(UWORD));<br />
parameter:<br />
Clear : Pointer to the clear screen function, used to quickly clear the screen to a certain color;<br />
</pre><br />
*Set the drawing pixel function.<br />
<pre><br />
void Paint_SetDisplayFuntion(void (*Display)(UWORD,UWORD,UWORD));<br />
parameter:<br />
Display: Pointer to the pixel drawing function, which is used to write data to the specified location in the internal RAM of the LCD;<br />
</pre><br />
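In the demos these two setters are typically called right after Paint_NewImage() to register the driver's own routines. A minimal sketch follows; LCD_Clear, LCD_DrawPaint, LCD_WIDTH, and LCD_HEIGHT follow the demos' usual naming and should be verified against your screen's LCD_Driver.<br />
<pre><br />
/* Sketch: wiring the GUI to the screen, as the demos do.<br />
   LCD_Clear / LCD_DrawPaint / LCD_WIDTH / LCD_HEIGHT are the demos'<br />
   usual names; verify them for your screen model. */<br />
Paint_NewImage(LCD_WIDTH, LCD_HEIGHT, 0, WHITE);<br />
Paint_SetClearFuntion(LCD_Clear);       /* fast full-screen clear */<br />
Paint_SetDisplayFuntion(LCD_DrawPaint); /* per-pixel write into LCD RAM */<br />
</pre><br />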
*Select image buffer: since you can create multiple image properties and multiple image buffers can exist at once, this function selects which buffer subsequent drawing operations apply to.<br />
<pre><br />
void Paint_SelectImage(UBYTE *image)<br />
Parameters:<br />
Image: the name of the image cache, which is actually a pointer to the first address of the image buffer<br />
</pre><br />
*Image Rotation: Set the rotation angle of the selected image, preferably right after Paint_SelectImage(); you can choose to rotate by 0, 90, 180, or 270 degrees.<br />
<pre><br />
void Paint_SetRotate(UWORD Rotate)<br />
Parameters:<br />
Rotate: ROTATE_0, ROTATE_90, ROTATE_180, and ROTATE_270 correspond to 0, 90, 180, and 270 degrees respectively;<br />
</pre><br />
【Note】With different rotation angles, the coordinates correspond to different starting pixels. Taking the 1.14inch as an example, the four pictures below are in the order 0°, 90°, 180°, 270°, for reference only:<br><br />
[[File:rotation-lcd.png]]<br/><br />
*Image mirror flip: Set the mirror flip of the selected image. You can choose no mirror, horizontal mirror, vertical mirror,or image center mirror.<br />
<pre><br />
void Paint_SetMirroring(UBYTE mirror)<br />
Parameters:<br />
Mirror: indicates the image mirroring mode. MIRROR_NONE, MIRROR_HORIZONTAL, MIRROR_VERTICAL, MIRROR_ORIGIN correspond to no mirror, horizontal mirror, vertical mirror, and about image center mirror respectively.<br />
</pre><br />
*Set a pixel's display position and color in the buffer: this is the core GUI function; it handles where each point is placed in the buffer and with which color.<br />
<pre><br />
void Paint_SetPixel(UWORD Xpoint, UWORD Ypoint, UWORD Color)<br />
Parameters:<br />
Xpoint: the X position of a point in the image buffer<br />
Ypoint: Y position of a point in the image buffer<br />
Color: indicates the Color of the dot<br />
</pre><br />
*Fill the color of a certain window in the image buffer: fills part of the buffer with a color, usually to refresh an area to blank; often used for time displays, to erase the previous second on the screen.<br />
<pre><br />
void Paint_ClearWindows(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color)<br />
Parameters:<br />
Xstart: the x-starting coordinate of the window<br />
Ystart: indicates the Y starting point of the window<br />
Xend: the x-end coordinate of the window<br />
Yend: indicates the y-end coordinate of the window<br />
Color: fill Color<br />
</pre><br />
*Draw points: In the image buffer, draw a point at (Xpoint, Ypoint); you can choose the color, the size, and the style of the point.<br />
<pre><br />
void Paint_DrawPoint(UWORD Xpoint, UWORD Ypoint, UWORD Color, DOT_PIXEL Dot_Pixel, DOT_STYLE Dot_Style)<br />
Parameters:<br />
Xpoint: indicates the X coordinate of a point<br />
Ypoint: indicates the Y coordinate of a point<br />
Color: fill Color<br />
Dot_Pixel: The size of the dot; eight sizes are provided by default<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Dot_Style: how the point expands: either outward from its center, or to the right and upward from its bottom-left corner<br />
typedef enum {<br />
DOT_FILL_AROUND = 1,<br />
DOT_FILL_RIGHTUP,<br />
} DOT_STYLE;<br />
</pre><br />
*Line drawing: In the image buffer, draw a line from (Xstart, Ystart) to (Xend, Yend); you can choose the color, the line width, and the line style. <br />
<pre><br />
void Paint_DrawLine(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color, DOT_PIXEL Line_width, LINE_STYLE Line_Style)<br />
Parameters:<br />
Xstart: the x-starting coordinate of a line<br />
Ystart: the y-starting coordinate of a line<br />
Xend: the x-end coordinate of a line<br />
Yend: the y-end coordinate of a line<br />
Color: fill Color<br />
Line_width: The width of the line; eight widths are provided by default<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Line_Style: line style. Selects whether the line is drawn solid or dotted<br />
typedef enum {<br />
LINE_STYLE_SOLID = 0,<br />
LINE_STYLE_DOTTED,<br />
} LINE_STYLE;<br />
</pre><br />
* Draw rectangle: In the image buffer, draw a rectangle from (Xstart, Ystart) to (Xend, Yend); you can choose the color, the width of the line, and whether to fill the inside of the rectangle. <br />
<pre><br />
void Paint_DrawRectangle(UWORD Xstart, UWORD Ystart, UWORD Xend, UWORD Yend, UWORD Color, DOT_PIXEL Line_width, DRAW_FILL Draw_Fill)<br />
Parameters:<br />
Xstart: the starting X coordinate of the rectangle<br />
Ystart: the starting Y coordinate of the rectangle<br />
Xend: the x-end coordinate of the rectangle<br />
Yend: the y-end coordinate of the rectangle<br />
Color: fill Color<br />
Line_width: The width of the four sides of the rectangle; eight widths are provided by default<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Draw_Fill: Fill, whether to fill the inside of the rectangle<br />
typedef enum {<br />
DRAW_FILL_EMPTY = 0,<br />
DRAW_FILL_FULL,<br />
} DRAW_FILL;<br />
</pre><br />
*Draw circle: In the image buffer, draw a circle of Radius with (X_Center Y_Center) as the center. You can choose the color, the width of the line, and whether to fill the inside of the circle.<br />
<pre><br />
void Paint_DrawCircle(UWORD X_Center, UWORD Y_Center, UWORD Radius, UWORD Color, DOT_PIXEL Line_width, DRAW_FILL Draw_Fill)<br />
Parameters:<br />
X_Center: the x-coordinate of the center of a circle<br />
Y_Center: Y coordinate of the center of a circle<br />
Radius: indicates the Radius of a circle<br />
Color: fill Color<br />
Line_width: The width of the arc; eight widths are provided by default<br />
typedef enum {<br />
DOT_PIXEL_1X1 = 1, // 1 x 1<br />
DOT_PIXEL_2X2 , // 2 X 2<br />
DOT_PIXEL_3X3 , // 3 X 3<br />
DOT_PIXEL_4X4 , // 4 X 4<br />
DOT_PIXEL_5X5 , // 5 X 5<br />
DOT_PIXEL_6X6 , // 6 X 6<br />
DOT_PIXEL_7X7 , // 7 X 7<br />
DOT_PIXEL_8X8 , // 8 X 8<br />
} DOT_PIXEL;<br />
Draw_Fill: fill, whether to fill the inside of the circle<br />
typedef enum {<br />
DRAW_FILL_EMPTY = 0,<br />
DRAW_FILL_FULL,<br />
} DRAW_FILL;<br />
</pre><br />
*Write Ascii character: In the image buffer, with (Xstart, Ystart) as the left vertex, write an Ascii character; you can select the Ascii character font library, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawChar(UWORD Xstart, UWORD Ystart, const char Ascii_Char, sFONT* Font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
Ascii_Char: indicates the Ascii character<br />
Font: Ascii visual character library, in the Fonts folder provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write English string: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of English characters; you can choose the Ascii character font library, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawString_EN(UWORD Xstart, UWORD Ystart, const char * pString, sFONT* Font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PString: pointer to the string to be displayed<br />
Font: Ascii visual character library, in the Fonts folder provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write Chinese string: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of Chinese characters; you can choose the GB2312-encoded character font, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawString_CN(UWORD Xstart, UWORD Ystart, const char * pString, cFONT* font, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PString: pointer to the string to be displayed<br />
Font: GB2312 encoding character Font library, in the Fonts folder provides the following Fonts:<br />
Font12CN: ASCII font 11*21, Chinese font 16*21<br />
Font24CN: ASCII font 24*41, Chinese font 32*41<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write numbers: In the image buffer, with (Xstart, Ystart) as the left vertex, write a string of numbers; you can choose the Ascii character font library, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawNum(UWORD Xpoint, UWORD Ypoint, double Nummber, sFONT* Font, UWORD Digit, UWORD Color_Foreground, UWORD Color_Background)<br />
Parameters:<br />
Xpoint: the x-coordinate of the left vertex of a character<br />
Ypoint: the Y coordinate of the left vertex of the font<br />
Nummber: the number to display, which can include decimals<br />
Digit: the number of digits to show after the decimal point<br />
Font: Ascii visual character library, in the Fonts folder provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Write numbers with decimals: with (Xstart, Ystart) as the left vertex, write a string of numbers with decimals; you can choose the Ascii character font, the font foreground color, and the font background color<br />
<pre><br />
void Paint_DrawFloatNum(UWORD Xpoint, UWORD Ypoint, double Nummber, UBYTE Decimal_Point, sFONT* Font, UWORD Color_Foreground, UWORD Color_Background);<br />
parameter:<br />
Xpoint: the X coordinate of the left vertex of the character<br />
Ypoint: the Y coordinate of the left vertex of the font<br />
Nummber: the displayed number, which is saved in double type here<br />
Decimal_Point: Displays the number of digits after the decimal point<br />
Font: Ascii code visual character font library, the following fonts are provided in the Fonts folder:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: font color<br />
Color_Background: background color<br />
</pre><br />
*Display time: In the image buffer, with (Xstart, Ystart) as the left vertex, display the time; you can choose the Ascii character font, the font foreground color, and the font background color.<br />
<pre><br />
void Paint_DrawTime(UWORD Xstart, UWORD Ystart, PAINT_TIME *pTime, sFONT* Font, UWORD Color_Background, UWORD Color_Foreground)<br />
Parameters:<br />
Xstart: the x-coordinate of the left vertex of a character<br />
Ystart: the Y coordinate of the font's left vertex<br />
PTime: the time to display; a time structure is defined here, just pass the hour, minute, and second values to the parameter;<br />
Font: Ascii visual character library, in the Fonts folder provides the following Fonts:<br />
Font8: 5*8 font<br />
Font12: 7*12 font<br />
Font16: 11*16 font<br />
Font20: 14*20 font<br />
Font24: 17*24 font<br />
Color_Foreground: Font color<br />
Color_Background: indicates the background color<br />
</pre><br />
*Display image: with (Xstart, Ystart) as the left vertex, display an image whose width is W_Image and height is H_Image;<br />
<pre><br />
void Paint_DrawImage(const unsigned char *image, UWORD xStart, UWORD yStart, UWORD W_Image, UWORD H_Image)<br />
parameter:<br />
image: image address, pointing to the image information you want to display<br />
Xstart: the X coordinate of the left vertex of the character<br />
Ystart: Y coordinate of the left vertex of the font<br />
W_Image: Image width<br />
H_Image: Image height<br />
</pre><br />
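Putting the pieces together, a minimal Arduino-style sketch could look like the following; Config_Init(), LCD_Init(), and LCD_SetBacklight() follow the usual naming in the Waveshare Arduino demos and should be checked against your screen's driver files.<br />
<pre><br />
/* Minimal sketch combining the functions above (Config_Init, LCD_Init,<br />
   LCD_SetBacklight follow the demos' usual naming; verify for your model). */<br />
void setup() {<br />
  Config_Init();                                   /* hardware interface setup */<br />
  LCD_Init();                                      /* LCD driver initialization */<br />
  LCD_SetBacklight(100);<br />
  Paint_NewImage(LCD_WIDTH, LCD_HEIGHT, 0, WHITE); /* GUI writes straight to LCD RAM */<br />
  Paint_SetClearFuntion(LCD_Clear);<br />
  Paint_SetDisplayFuntion(LCD_DrawPaint);<br />
  Paint_Clear(WHITE);<br />
  Paint_DrawFloatNum(5, 40, 3.14159, 2, &Font20, BLACK, WHITE); /* shows 3.14 */<br />
  Paint_DrawString_EN(5, 70, "Waveshare", &Font16, BLACK, WHITE);<br />
}<br />
void loop() {}<br />
</pre><br />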
{{LCD1.83 for Pico use}}<br />
{{LCD1.83 for ESP32 use}}<br />
<br />
=Resource=<br />
==Document==<br />
*[https://files.waveshare.com/wiki/1.83inch-LCD-Module/1.83inch_LCD_Module-Sch.pdf Schematic]<br />
*[https://files.waveshare.com/wiki/1.5inch-LCD-Module/NV3030B_datasheet_V0.6_20220118.pdf NV3030B Datasheet]<br />
==Demo==<br />
*[https://files.waveshare.com/wiki/1.83inch-LCD-Module/LCD_1.83_Code.zip Demo]<br />
==Software==<br />
*[https://files.waveshare.com/upload/7/78/LcmZimo.zip LcmZimo]<br />
*[https://files.waveshare.com/upload/b/bd/Image2Lcd2.9.zip Image2Lcd]<br />
*[https://www.waveshare.com/wiki/Image_Extraction Image Extraction]<br />
<br />
=FAQ=<br />
{{FAQ|What is the maximum brightness of 1.83inch LCD Module?<br />
|<br />
3.3V 500cd/㎡<br />
||}}<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/0.49inch_OLED_Module0.49inch OLED Module2024-03-08T08:04:14Z<p>Eng52: /* FAQ */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File: 0.49inch OLED Module.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/0.49inch-oled-module.htm}}]]<br />
|caption=64×32<br/>0.49inch<br/>I2C<br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
This product is a 0.49-inch OLED module with a built-in SSD1315 driver chip and provides demos for Raspberry Pi, Pico, Arduino, STM32, and so on.<br />
==Specification==<br />
*Operating voltage: 5V/3.3V<br />
*Communication interface: I2C<br />
*Chip: SSD1315<br />
*Resolution: 64 × 32<br />
*Display size: 14.4 × 11.5(mm)<br />
*Pixel size: 0.155 × 0.155(mm)<br />
*Board size: 15.50 × 13.00(mm)<br />
*Display color: Black, White <br />
==Pinout==<br />
{|border=1; style="width:600px;" align="center"<br />
|-style="background:green; color:white;" align="center"<br />
|Pinout||I2C<br />
|-align="center"<br />
|VCC<br />
|colspan="2"|3.3V / 5V power input<br />
|-align="center"<br />
|GND<br />
|colspan="2"|Ground<br />
|-align="center"<br />
|SDA||I2C data input<br />
|-align="center"<br />
|SCL||I2C clock input<br />
|}<br />
==LCD and the controller==<br />
The SSD1315 is an OLED controller supporting 128 × 64 pixels, but this OLED panel is only 64 × 32 pixels, so the screen uses only the first half of the SSD1315 buffer area.<br /><br />
This OLED supports communication interfaces such as 8-bit 8080 parallel, 8-bit 6800 parallel, 3-wire SPI, 4-wire SPI, and I2C. However, because of the module's size and limited IO resources, the first four interfaces are not used; only I2C communication is adopted.<br />
==Working Protocol==<br />
[[file:0.91inch_oled_module_i2c3.png|800px]]<br /><br />
Note: In I2C communication, the master first sends a 7-bit slave address plus a 1-bit read/write flag, then waits for the slave's acknowledgment.<br /><br />
After the slave responds, the master sends a control byte, which determines whether the byte(s) that follow are a command or data, and again waits for the slave's acknowledgment.<br /><br />
After the slave responds again: for a command, only a single command byte can follow; for data, either one byte or several consecutive data bytes can be sent, depending on the situation.<br /><br />
[https://files.waveshare.com/upload/a/af/SSD1306-Revision_1.1.pdf More details refer to Datasheet Page20 Figure 8-7]<br /><br />
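As a concrete illustration of the frame layout described above, the sketch below builds one command frame and one data frame. The 0x3C address is only the typical SSD13xx 7-bit address, 0x00/0x40 are the standard command/data control bytes, and i2c_write() is a placeholder for your platform's I2C transmit call; this is a sketch, not the demo's code.<br />
<pre><br />
/* Sketch of the two I2C frame types described above.<br />
   OLED_ADDR and i2c_write() are placeholders for your platform. */<br />
#define OLED_ADDR 0x3C                          /* typical SSD13xx 7-bit slave address */<br />
<br />
unsigned char cmd_frame[2]  = {0x00, 0xAF};     /* control byte 0x00: a command byte follows (0xAF = display ON) */<br />
unsigned char data_frame[4] = {0x40, 0x11, 0x22, 0x33}; /* control byte 0x40: consecutive data bytes follow */<br />
<br />
i2c_write(OLED_ADDR, cmd_frame,  sizeof(cmd_frame));<br />
i2c_write(OLED_ADDR, data_frame, sizeof(data_frame));<br />
</pre><br />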
{{0.49 OLED RPI Guides}}<br />
{{0.49 OLED STM Guides}}<br />
{{0.49 OLED Arduino Guides}}<br />
{{0.49 OLED Pico Guides}}<br />
=Resource=<br />
==Document==<br />
*[https://files.waveshare.com/wiki/0.49inch-OLED-Module/0.49inch_OLED_Module_Schematic.pdf Schematic]<br />
==Demo==<br />
*[https://files.waveshare.com/upload/2/2c/OLED_Module_Code.7z Demo Code]<br />
==Software==<br />
*[https://files.waveshare.com/upload/c/c6/Zimo221.7z Zimo221]<br />
*[https://files.waveshare.com/upload/b/bd/Image2Lcd2.9.zip Image2Lcd]<br />
*[https://www.waveshare.com/wiki/Image_Extraction Image Extraction Guide]<br />
==Datasheet==<br />
*[https://files.waveshare.com/upload/f/f0/SSD1315_1.1.pdf SSD1315 Datasheet]<br />
=FAQ=<br />
{{0.96-OLED-FAQ}}<br />
<br />
=Support=<br />
{{Servicebox1}}</div>Eng52https://www.waveshare.com/wiki/ESP32-S3-Relay-6CHESP32-S3-Relay-6CH2024-03-07T10:50:33Z<p>Eng52: /* Schematic */</p>
<hr />
<div><div class="wiki-pages jet-green-color"><br />
{{Infobox item|colorscheme=green<br />
|img=[[File:ESP32-S3-Relay-6CH.jpg|300px|{{Amazon_nolink|default={{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}|url=link=https://www.waveshare.com/esp32-s3-relay-6ch.htm}}]]<br />
|caption=6CH<br/>RS485, Pico, USB<br/><br />
|brand=Waveshare<br />
|{{#ifeq: {{#urlget:amazon|0}}|{{#urlget:Amazon|0}}| default|}}=display<br />
|website_cn=[https://www.waveshare.net/shop 官网]<br />
|website_en=[https://www.waveshare.com Website]<br />
}}<br />
=Overview=<br />
==Parameter== <br />
{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|Items<br />
|style="background:green; color:white;text-align:center;" |Parameters<br />
|-<br />
|style="text-align:center;"|Supply Voltage<br />
|style="text-align:center;" |7~36V (or 5V/1A Type-C Interface)<br />
|-<br />
|style="text-align:center;"|Relay channels<br />
|style="text-align:center;" |6 CH<br />
|-<br />
|style="text-align:center;"|Contact form<br />
|style="text-align:center;" |1NO 1NC<br />
|-<br />
|style="text-align:center;"|Interface<br />
|style="text-align:center;" |Type-C<br />
|-<br />
|style="text-align:center;"|Communication Protocol<br />
|style="text-align:center;" |USB Protocol<br />
|-<br />
|style="text-align:center;"|Dimensions<br />
|style="text-align:center;" |88(H) x 122(V)mm<br />
|}<br />
<br />
==Onboard Interface==<br />
{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|GPIO Control<br />
|style="background:green; color:white;text-align:center;" |Function<br />
|style="background:green; color:white;text-align:center;" |Description<br />
|-<br />
|style="text-align:center;"|GP0<br />
|style="text-align:center;" |BOOT Key<br />
|style="text-align:center;" |BOOT Key Control Pin<br />
|-<br />
|style="text-align:center;"|GP21<br />
|style="text-align:center;" |BUZZER<br />
|style="text-align:center;" |Buzzer Control Pin <br />
|-<br />
|style="text-align:center;"|GP38<br />
|style="text-align:center;" |RGB<br />
|style="text-align:center;" |RGB LED Control Pin <br />
|-<br />
|style="text-align:center;"|GP1<br />
|style="text-align:center;" |CH1<br />
|style="text-align:center;" |Relay No.1 Control Pin <br />
|-<br />
|style="text-align:center;"|GP2<br />
|style="text-align:center;" |CH2<br />
|style="text-align:center;" |Relay No.2 Control Pin <br />
|-<br />
|style="text-align:center;"|GP41<br />
|style="text-align:center;" |CH3<br />
|style="text-align:center;" |Relay No.3 Control Pin <br />
|-<br />
|style="text-align:center;"|GP42<br />
|style="text-align:center;" |CH4<br />
|style="text-align:center;" |Relay No.4 Control Pin <br />
|-<br />
|style="text-align:center;"|GP45<br />
|style="text-align:center;" |CH5<br />
|style="text-align:center;" |Relay No.5 Control Pin <br />
|-<br />
|style="text-align:center;"|GP46<br />
|style="text-align:center;" |CH6<br />
|style="text-align:center;" |Relay No.6 Control Pin <br />
|-<br />
|style="text-align:center;"|GP17<br />
|style="text-align:center;" |TXD<br />
|style="text-align:center;" |UART TX, converted to RS485 <br />
|-<br />
|style="text-align:center;"|GP18<br />
|style="text-align:center;" |RXD<br />
|style="text-align:center;" |UART RX pin, converted to RS485<br />
|}<br />
= Multifunction Control =<br />
== Example Analyze ==<br />
* Here we provide four examples that perform the following functions. '''Users can write their own examples to implement other functions; the examples provided are only for basic operation of the device.'''<br />
* ''' The factory default demo is MAIN_WIFI_AP.'''<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:#666666; color:white;text-align:center;"|File<br />
|style="background:#666666; color:white;text-align:center;" |Function <br />
|style="background:#666666; color:white;text-align:center;" |Note<br />
|-<br />
| style="text-align:center;"colspan="1" rowspan="3" | '''MAIN_WIFI_AP'''<br />
| style="text-align:center;"|RS485 Interface Control <br />
| style="text-align:center;" rowspan="3" |Support directly programmed<br>Only support when the Web interface connects to WIFI<br><br />
|-<br />
| style="text-align:center;"|Bluetooth Control, Bluetooth sends IP <br />
|-<br />
| style="text-align:center;"|Web Interface Control (Close) <br />
|-<br />
|<br />
|-<br />
| style="text-align:center;"colspan="1" rowspan="3" | '''MAIN_WIFI_STA'''<br />
| style="text-align:center;"|RS485 Interface Control<br />
| style="text-align:center;" rowspan="3" |<font color="red">Only support after modify </font><br>Require WIFI connection after [[#Demo Modify| Modify]]<br>Web page for intranet use only<br><br />
|-<br />
| style="text-align:center;"|Bluetooth control, Bluetooth sends IP <br />
|-<br />
| style="text-align:center;"|Web Interface Control (short-distance)<br />
|-<br />
|<br />
|-<br />
| style="text-align:center;"colspan="1" rowspan="3" | '''MAIN_WIFI_MQTT'''<br />
| style="text-align:center;"|RS485 Interface Control<br />
| style="text-align:center;" rowspan="3" |<font color="red">Only support after modify </font><br>Require WIFI connection after [[#Demo Modify| Modify]]<br>Require [[#Create Device|Create]] device on Waveshare cloud platform <br><br />
|-<br />
| style="text-align:center;"|Bluetooth Control, Bluetooth sends IP <br />
|-<br />
| style="text-align:center;"|Waveshare Cloud Control (long-distance)<br />
|-<br />
|<br />
|-<br />
| style="text-align:center;"colspan="1" rowspan="4" | '''MAIN_ALL'''<br />
| style="text-align:center;"|RS485 Interface Control <br />
| style="text-align:center;" rowspan="4" |<font color="red">Only support after modify </font><br>Require WIFI connection after [[#Demo Modify| Modify]]<br>Require [[#Create Device|Create]] device on Waveshare cloud platform<br>Web page for intranet use only<br><br />
|-<br />
| style="text-align:center;"|Bluetooth control, Bluetooth sends IP <br />
|-<br />
| style="text-align:center;"|Web Interface Control (short-distance)<br />
|-<br />
| style="text-align:center;"|Waveshare Cloud Control (long-distance)<br />
|}<br />
<br />
== Preparation ==<br />
=== Install Library Files ===<br />
* Install ArduinoJson library:<br />
[[File:ESP32-S3-Relay-6CH TO Lib 1.png]]<br />
* Install PubSubClient library:<br />
[[File:ESP32-S3-Relay-6CH TO Lib 2.png]]<br />
* Install NTPClient library:<br />
[[File:ESP32-S3-Relay-6CH TO Lib 3.png]]<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
=== Demo Modify===<br />
<div class="mw-collapsible-content"><br />
* There is a '''WS_Information.h''' file in each example; change its contents to your own information.<br />
* Take '''MAIN_ALL''' as an example:<br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 1.png]]<br />
* Open ''' WS_Information.h''' file:<br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 2.png]]<br />
*Modify the default WIFI name "Waveshare-TL" and password "waveshare".<br />
* For example, to connect to the WIFI "Waveshare-WIFI" with the password "123456789", modify as shown below: <br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 3.png]]<br />
* If you need to control relays on the Waveshare cloud platform, the device needs to be created first. <br />
* Modify the related data according to the devices on the Waveshare cloud platform. <br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 4.png]]<br />
* As modified as below:<br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 5.png]]<br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
=== Demo Flash ===<br />
<div class="mw-collapsible-content"><br />
*Once the demo modification is complete, connect the device, then select the development board and COM port. <br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 6.png]]<br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 7.png]]<br />
*Click to flash.<br />
[[File:ESP32-S3-Relay-6CH TO Burnprocess 8.png]]<br />
</div></div><br />
<br />
== RS485 Control ==<br />
Connect an RS485 device to the ESP32-S3-Relay-6CH and send commands to switch each relay ON/OFF; the default baud rate is 115200.<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|Command<br />
|style="background:green; color:white;text-align:center;" |Function<br />
|-<br />
|style="text-align:center;"|06 05 00 01 55 00 A2 ED<br />
|style="text-align:center;" |Switch CH1 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 02 55 00 52 ED<br />
|style="text-align:center;" |Switch CH2 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 03 55 00 03 2D<br />
|style="text-align:center;" |Switch CH3 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 04 55 00 B2 EC<br />
|style="text-align:center;" |Switch CH4 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 05 55 00 E3 2C<br />
|style="text-align:center;" |Switch CH5 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 06 55 00 13 2C<br />
|style="text-align:center;" |Switch CH6 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05 00 FF FF 00 BD BD<br />
|style="text-align:center;" |All relays ON<br />
|-<br />
|style="text-align:center;"|06 05 00 FF 00 00 FC 4D <br />
|style="text-align:center;" |All relays OFF<br />
|}<br />
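For scripted control instead of SSCOM, the sketch below writes the CH1 frame from the table verbatim to an already-opened serial port. The frames appear to follow Modbus-RTU write-single-coil framing (device address 0x06, function 0x05, coil address, value 0x5500 to toggle, then CRC-16), though that reading is inferred from the byte layout; uart_send() is a placeholder for your platform's serial write at 115200 baud, 8N1.<br />
<pre><br />
/* Sketch only: send the CH1 toggle frame from the table above.<br />
   uart_send() is a placeholder for your serial-write call (115200, 8N1). */<br />
unsigned char ch1_toggle[8] = {0x06, 0x05, 0x00, 0x01, 0x55, 0x00, 0xA2, 0xED};<br />
uart_send(ch1_toggle, sizeof(ch1_toggle));<br />
</pre><br />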
=== Hardware Connection===<br />
&nbsp;&nbsp; Take [https://www.waveshare.com/usb-to-4ch-serial-converter.htm USB TO 4CH Serial Converter] for example:<br />
:{|border=1; style="width:800px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;width:400px;" | ESP32-S3-Relay-6CH<br />
|style="background:green; color:white;text-align:center;" | USB TO 4CH Serial Converter - PORT B<br />
|-<br />
|style="text-align:center;" |RS485 - A+<br />
|style="text-align:center;" |Port B - A+<br />
|-<br />
|style="text-align:center;" |RS485 - B-<br />
|style="text-align:center;" |Port B - B-<br />
|}<br />
<br />
=== Software Operation ===<br />
* Sending data through [https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/SSCOM5.13.1_For_ESP32_S3_Relay_6CH.zip SSCOM].<br />
* Open SSCOM, and select the COM port corresponding to Port B connected as above.<br />
[[File:ESP32-S3-Relay-6CH TO RS485 1.png]]<br />
* Open COM, and use the multiple send function to quickly send commands.<br />
[[File:ESP32-S3-Relay-6CH TO RS485 2.png]]<br />
* With the following commands, you can control the relays. <br />
[[File:ESP32-S3-Relay-6CH TO RS485 3.png]]<br />
<br />
== Web Interface control ==<br />
Connect your mobile phone to the ESP32-S3-Relay-6CH with a Bluetooth debugging assistant, obtain the IP after the WIFI connection, and then open the Web interface at that IP. <br />
=== AP Mode ===<br />
* Connect to ESP32-S3-Relay-6CH's WIFI, and the WIFI name is "ESP32-S3-Relay-6CH", and the password is "waveshare".<br />
* '''[[#Software Operation (Get IP) | Get Current IP ]]''' through Bluetooth Debugging Assistant.<br />
* Enter the Web interface, and you can control relays. (It takes a little while to configure the device after powering up, and the first time you enter the web page after each power-up may be slow.)<br />
[[File:ESP32-S3-Relay-6CH TO Web 1.png]]<br />
=== STA Mode ===<br />
* After power-on, it automatically connects to the configured WIFI (''' before flashing, you need to [[#Demo Modify| modify the WIFI to be connected]]''') <br />
* '''[[#Software Operation (Get IP) | Get Current IP ]]''' through Bluetooth Debugging Assistant.<br />
* Enter the Web interface, and you can control relays. (It takes a little while to configure the device after powering up, and the first time you enter the web page after each power-up may be slow.)<br />
[[File:ESP32-S3-Relay-6CH TO Web 2.png]]<br />
<br />
== Bluetooth Control ==<br />
Use your mobile phone to connect to the ESP32-S3-Relay-6CH through a Bluetooth debugging assistant, and send commands to switch all relays ON/OFF. <br><br />
'''Please note that some Bluetooth debugging assistants send data in ASCII form by default; enter the control commands in the form your assistant expects before controlling the device.'''<br><br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"colspan="2" |Command<br />
|style="background:green; color:white;text-align:center;" rowspan="2"|Function<br />
|-<br />
|style="background:green; color:white;text-align:center;"|ASCII<br />
|style="background:green; color:white;text-align:center;"|Hex<br />
|-<br />
|style="text-align:center;"|1<br />
|style="text-align:center;"|0x31<br />
|style="text-align:center;" |Switch CH1 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|2<br />
|style="text-align:center;"|0x32<br />
|style="text-align:center;" |Switch CH2 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|3<br />
|style="text-align:center;"|0x33<br />
|style="text-align:center;" |Switch CH3 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|4<br />
|style="text-align:center;"|0x34<br />
|style="text-align:center;" |Switch CH4 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|5<br />
|style="text-align:center;"|0x35<br />
|style="text-align:center;" |Switch CH5 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|6<br />
|style="text-align:center;"|0x36<br />
|style="text-align:center;" |Switch CH6 Relay ON/OFF <br />
|-<br />
|style="text-align:center;"|7<br />
|style="text-align:center;"|0x37<br />
|style="text-align:center;" |ALL Relays ON<br />
|-<br />
|style="text-align:center;"|8<br />
|style="text-align:center;"|0x38<br />
|style="text-align:center;" |ALL Relays OFF<br />
|}<br />
<br />
=== Software Operation (Get IP) ===<br />
* Use [https://www.nordicsemi.com/Products/Development-tools/nRF-Connect-for-mobile nRF Connect] on your phone to control relays (or you can use other Bluetooth Debugging Assistants).<br />
* Take nRF Connect as an example: <br />
* Bluetooth name: ESP32 S3 Relay 6CH.<br />
[[File:ESP32-S3-Relay-6CH TO Bluetooth 1.png |800px]]<br />
* After connecting, select "Unknown Service" and click to read the data. If the WIFI fails to connect for a long time, the RGB LED stays solid red and this step returns no response. <br />
* After connecting to WIFI, it receives the device IP. As shown below, '''the Device IP is 192.168.6.133.'''<br />
[[File:ESP32-S3-Relay-6CH TO Bluetooth 2.png |800px]]<br />
*Bluetooth control relay commands are characters 1~8, i.e. hexadecimal '''0x31 ~ 0x38'''.<br />
*Click send, fill in the data to be sent (in hexadecimal here), and enter 0x31 as follows:<br />
[[File:ESP32-S3-Relay-6CH TO Bluetooth 3.png |800px]]<br />
*Send "0x31" to switch the CH1 relay status.<br />
[[File:ESP32-S3-Relay-6CH TO Bluetooth 4.png |800px]]<br />
*Send "0x38" to control all relays OFF. <br />
[[File:ESP32-S3-Relay-6CH TO Bluetooth 5.png |800px]]<br />
<br />
== Waveshare Cloud Platform==<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
=== Create Device===<br />
<div class="mw-collapsible-content"><br />
<br />
* Login to [https://waveshare.cloud/#/login Waveshare cloud].<br />
[[File:ESP32-S3-Relay-6CH TO Cloud 0.png]]<br />
* Then, enter "Devices Manager" to "Add New" device, and fill in "Type Name": <br />
[[File:ESP32-S3-Relay-6CH TO Cloud 1.png]]<br />
* Created successfully. <br />
[[File:ESP32-S3-Relay-6CH TO Cloud 73.png]]<br />
* Create another device by "One-click Add" -> "Common Device Template".<br />
[[File:ESP32-S3-Relay-6CH TO Cloud 2.png]]<br />
*Edit "Enter Device Name":<br />
[[File:ESP32-S3-Relay-6CH TO Cloud 3.png]]<br />
*Click to edit and select its device type. <br />
[[File: ESP32-S3-Relay-6CH TO Cloud 4.png]]<br />
[[File:ESP32-S3-Relay-6CH TO Cloud 6.png]]<br />
*Save.<br />
[[File:ESP32-S3-Relay-6CH TO Cloud 7.png]]<br />
*Then, you can see the device details. [[#Demo Modify| Modify ]] '''WS_Information.h''' file and [[#Demo Flash| Flash]] it before using.<br />
[[File: ESP32-S3-Relay-6CH TO Burnprocess 4.png]]<br />
</div></div><br />
<br />
=== Software Operation ===<br />
* Go to Waveshare website, and enter '''[https://waveshare.cloud/#/login Waveshare Cloud]'''.<br />
* Complete registration and '''[[#Create Device | create]] device.'''<br />
* '''[[#Demo Modify | Modify]]''' the corresponding MQTT and WIFI. <br />
* '''[[#Demo Flash | Demo Flash]].'''<br />
* Enter "Dashboard" on the Waveshare Cloud Platform. <br />
[[File:ESP32-S3-Relay-6CH TO Waveshare cloud 1.png]]<br />
*Select the corresponding device option and "Enter Development". <br />
[[File:ESP32-S3-Relay-6CH TO Waveshare cloud 2.png]]<br />
<br />
= External Expansion =<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
== RS485 Expand Relay Channel ==<br />
<div class="mw-collapsible-content"><br />
* Use [https://www.waveshare.com/modbus-rtu-relay.htm Modbus RTU Relay] to expand the 8-ch relay. <br />
* The four main example demos already support this operation; set Extension_Enable in WS_Information.h to 1 (the default).<br />
* The expansion relays can then be controlled via Bluetooth using the commands below (see the sketch at the end of this section).<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|Operation Command<br />
|style="background:green; color:white;text-align:center;" |Function<br />
|-<br />
|style="text-align:center;"|06 01<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH1 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 02<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH2 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 03<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH3 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 04<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH4 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 05<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH5 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 06<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH6 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 07<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH7 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 08<br />
|style="text-align:center;" |Switch Modbus RTU Relay's CH8 relay ON/OFF <br />
|-<br />
|style="text-align:center;"|06 09<br />
|style="text-align:center;" |ALL Modbus RTU Relay's relays ON<br />
|-<br />
|style="text-align:center;"|06 0A<br />
|style="text-align:center;" |ALL Modbus RTU Relay's relays OFF<br />
|}<br />
<br />
* The expansion relay control commands are two bytes each, i.e. hexadecimal '''0x06 0x01 ~ 0x06 0x0A''' (see the table above).<br />
* Click the send icon and fill in the data to send (in hexadecimal), e.g. 0x06 0x01:<br />
[[File:ESP32-S3-Relay-6CH TO Extension 1.png|1000px]]<br />
* Send 0x06 0x01 to toggle the expansion CH1 relay.<br />
[[File:ESP32-S3-Relay-6CH TO Extension 2.png|1000px]]<br />
* Send 0x06 0x0A to turn all expansion relays OFF.<br />
[[File:ESP32-S3-Relay-6CH TO Extension 3.png|1000px]]<br />
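As a starting point for secondary development, below is a minimal sketch of how the two-byte expansion commands could be parsed. The Modbus_Relay_Toggle() and Modbus_Relay_All() helpers are hypothetical stand-ins for the RS485/Modbus writes the demo performs, and the sketch assumes Extension_Enable is set to 1 in WS_Information.h (the default).<br />
<syntaxhighlight lang="cpp"><br />
// Hypothetical parser for the expansion commands (0x06 0x01 .. 0x06 0x0A).<br />
void Modbus_Relay_Toggle(uint8_t ch);  // placeholder: write the matching Modbus frame<br />
void Modbus_Relay_All(bool on);        // placeholder: switch all expansion relays<br />
<br />
void Handle_Extension_Command(uint8_t b0, uint8_t b1) {<br />
  if (b0 != 0x06) return;              // expansion commands start with 0x06<br />
  if (b1 >= 0x01 && b1 <= 0x08) {<br />
    Modbus_Relay_Toggle(b1);           // toggle expansion channel 1..8<br />
  } else if (b1 == 0x09) {<br />
    Modbus_Relay_All(true);            // all expansion relays ON<br />
  } else if (b1 == 0x0A) {<br />
    Modbus_Relay_All(false);           // all expansion relays OFF<br />
  }<br />
}<br />
</syntaxhighlight><br />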
</div></div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
== Working with Pico for Timer Switch Function ==<br />
<div class="mw-collapsible-content"><br />
* Use [https://www.waveshare.com/pico-rtc-ds3231.htm Pico-RTC-DS3231] to expand the timer switch function. <br />
* Connect [https://www.waveshare.com/pico-rtc-ds3231.htm Pico-RTC-DS3231] to ESP32-S3-Relay-6CH.<br />
* This function is supported by the above four main example demos; set "RTC_Enable" in "WS_Information.h" to 1 (the default is 0; make sure the [https://www.waveshare.com/pico-rtc-ds3231.htm Pico-RTC-DS3231] is connected before enabling it).<br />
[[File:ESP32-S3-Relay-6CH TO Extension2 1.png]]<br />
*Set RTC_OPEN_Time_Hour, RTC_OPEN_Time_Min, RTC_Close_Time_Hour, and RTC_Close_Time_Min in WS_Information.h to the desired switching times:<br />
:{|border=1; style="width:700px;" align="auto"<br />
|-<br />
|style="background:green; color:white;text-align:center;"|Items<br />
|style="background:green; color:white;text-align:center;" |Parameters<br />
|-<br />
|style="text-align:center;"|RTC_OPEN_Time_Hour<br />
|style="text-align:center;" |All relay timing switch ON --in hours<br />
|-<br />
|style="text-align:center;"|RTC_OPEN_Time_Min<br />
|style="text-align:center;" |All relay timing switch ON --in minutes <br />
|-<br />
|style="text-align:center;"|RTC_Closs_Time_Hour<br />
|style="text-align:center;" |All relay timing switch OFF --in hours <br />
|-<br />
|style="text-align:center;"|RTC_Closs_Time_Min<br />
|style="text-align:center;" |All relay timing switch OFF --in minutes <br />
|}<br />
* The following settings switch all relays ON at 08:06 and OFF at 16:30 every day (a sketch of the corresponding entries follows the screenshot).<br />
[[File:ESP32-S3-Relay-6CH TO Extension2 2.png]]<br />
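The corresponding entries in WS_Information.h might look like the following; the macro form is an assumption based on the parameter names above, so match it to the actual file.<br />
<syntaxhighlight lang="cpp"><br />
// Assumed form of the timer settings in WS_Information.h:<br />
// all relays ON at 08:06, OFF at 16:30 every day.<br />
#define RTC_OPEN_Time_Hour   8<br />
#define RTC_OPEN_Time_Min    6<br />
#define RTC_Close_Time_Hour  16<br />
#define RTC_Close_Time_Min   30<br />
</syntaxhighlight><br />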
</div></div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
==Working with Pico for Expanding CAN Interface==<br />
<div class="mw-collapsible-content"><br />
*Use [https://www.waveshare.com/pico-can-b.htm Pico-CAN-B] to expand the CAN communication interface. <br />
=== Preparation ===<br />
* Install mcp_can library:<br />
[[File:ESP32-S3-Relay-6CH TO Extension3 1.png]]<br />
=== User Manual ===<br />
* Connect [https://www.waveshare.com/pico-can-b.htm Pico-CAN-B] to ESP32-S3-Relay-6CH.<br />
* Download [https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32_S3_Relay_6CH_Demo_For_Pico_CAN_B.zip Pico-CAN-B example demo].<br />
* Modify it according to your needs. <br />
[[File:ESP32-S3-Relay-6CH TO Extension3 2.png]]<br />
* In the main program, use the files WS_MCP2515.c and WS_MCP2515.h.<br />
* After including them, you can use the functions '''receiveCANData(uint32_t* canId, uint8_t* data)''' and '''sendCANData(uint32_t canId, uint8_t len, uint8_t* data)''' in the main program for receiving and sending CAN data, respectively (a usage sketch follows the screenshot).<br />
[[File:ESP32-S3-Relay-6CH TO Extension3 3.png]]<br />
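The two signatures above come from the demo; the loop below is only a usage sketch, and the assumption that receiveCANData() returns the received length (0 when no frame is pending) should be checked against the actual demo code.<br />
<syntaxhighlight lang="cpp"><br />
#include "WS_MCP2515.h"<br />
<br />
void loop() {<br />
  uint32_t canId;<br />
  uint8_t data[8];                             // classic CAN payload is up to 8 bytes<br />
  uint8_t len = receiveCANData(&canId, data);  // assumed: returns received length<br />
  if (len > 0) {<br />
    sendCANData(0x123, len, data);             // echo the frame back on ID 0x123<br />
  }<br />
}<br />
</syntaxhighlight><br />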
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
== Working with Pico For Expanding Environment Monitoring Function ==<br />
<div class="mw-collapsible-content"><br />
*Use [https://www.waveshare.com/pico-environment-sensor.htm Pico-Environment-Sensor] to expand the environment monitoring function. <br />
=== Preparation ===<br />
*Install Adafruit BME280 library:<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 1.png]]<br />
*Select "Install All".<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 2.png]]<br />
* Install Adafruit TSL2591 library:<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 3.png]]<br />
*Select "Install All":<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 4.png]]<br />
*Install Adafruit LTR390 library:<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 8.png]]<br />
*Select "Install All":<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 9.png]]<br />
* Install Adafruit SGP40 library:<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 10.png]]<br />
*Select "Install All":<br />
[[File:ESP32-S3-Relay-6CH TO Extension4 11.png]]<br />
<br />
=== How to Use===<br />
* Connect the [https://www.waveshare.com/pico-environment-sensor.htm Pico-Environment-Sensor] to ESP32-S3-Relay-6CH.<br />
* Download the [https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32_S3_Relay_6CH_Demo_For_Pico_Environment_Sensor.zip Pico-Environment-Sensor sample demo].<br />
* Use Environment_Sensor.c and Environment_Sensor.h in the main demo.<br />
* Modify it according to your needs; a standalone sensor-reading sketch follows. <br />
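As a standalone illustration (not the Waveshare demo itself), the sketch below reads the BME280 with the Adafruit library; the I2C address 0x76 is an assumption and may be 0x77 on some modules.<br />
<syntaxhighlight lang="cpp"><br />
#include <Adafruit_BME280.h><br />
<br />
Adafruit_BME280 bme;<br />
<br />
void setup() {<br />
  Serial.begin(115200);<br />
  if (!bme.begin(0x76)) {                  // assumed I2C address; some boards use 0x77<br />
    Serial.println("BME280 not found");<br />
    while (true) delay(1000);<br />
  }<br />
}<br />
<br />
void loop() {<br />
  Serial.printf("T=%.1f C  RH=%.1f %%  P=%.1f hPa\n",<br />
                bme.readTemperature(), bme.readHumidity(),<br />
                bme.readPressure() / 100.0F);  // readPressure() returns Pa<br />
  delay(2000);<br />
}<br />
</syntaxhighlight><br />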
</div></div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
== Working with Pico for Expanding RS232/RS485 Interface==<br />
<div class="mw-collapsible-content"><br />
* <font color="red">Please note that when using the [https://www.waveshare.com/pico-2ch-rs485.htm Pico-2CH-RS485] expansion with the RS485 interface, only Channel 1 is supported. Channel 0 is not available for use.</font><br />
* Use [https://www.waveshare.com/pico-2ch-rs232.htm Pico-2CH-RS232] to expand RS232 interface.<br />
=== How to Use ===<br />
* Connect [https://www.waveshare.com/pico-2ch-rs232.htm Pico-2CH-RS232] or [https://www.waveshare.com/pico-2ch-rs485.htm Pico-2CH-RS485] to ESP32-S3-Relay-6CH.<br />
* Download [https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32_S3_Relay_6CH_Demo_For_Pico_2CH_RS232.zip Pico-2CH-RS232 example demo] (This demo is compatible with the above two devices.)<br />
* In the main program, include the file WS_UART.h. <br />
* Modify it according to your needs. <br />
[[File:ESP32-S3-Relay-6CH TO Extension5 1.png]]<br />
* By default, received characters are printed in real time. <br />
* Call the initialization function Extension_Init() in setup(). <br />
* After initialization, you can send data in the main loop using the functions '''SetData2(uint8_t* data, size_t length)''' and '''SetData3(uint8_t* data, size_t length)''' (see the sketch below).<br />
[[File:ESP32-S3-Relay-6CH TO Extension5 2.png]]<br />
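A minimal usage sketch for the helpers named above; Extension_Init(), SetData2(), and SetData3() are from the demo, while the message contents here are placeholders.<br />
<syntaxhighlight lang="cpp"><br />
#include <string.h><br />
#include "WS_UART.h"<br />
<br />
void setup() {<br />
  Extension_Init();                        // bring up the expansion UARTs<br />
}<br />
<br />
void loop() {<br />
  const char text[] = "hello RS232/RS485"; // placeholder payload<br />
  SetData2((uint8_t*)text, strlen(text));  // send on the first channel<br />
  SetData3((uint8_t*)text, strlen(text));  // send on the second channel<br />
  delay(1000);<br />
}<br />
</syntaxhighlight><br />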
</div></div><br />
<br />
= Use in Micropython =<br />
== Flash Firmware Demo ==<br />
* Download and unzip the MicroPython firmware: ([https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32-S3-Relay-6CH-Flash.zip Flash Tool and Firmware]).<br><br />
* Connect the device to the PC.<br />
* Open the '''flash_download_tool_3.9.4.exe''' software, select ESP32-S3 and USB.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Firmware 1.png]]<br />
* Select the corresponding COM port; the other options are preconfigured, so just click "Start" to download. (If the COM port is not detected, press and hold the BOOT key, press the RESET key, and then release the RESET key first.)<br><br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Firmware 2.png]]<br />
*If the download stalls or doesn't start, enter download mode: press and hold the BOOT key, press the RESET key, and then release the RESET key first. Try the download again and wait for the flashing process to complete.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Firmware 3.png]]<br />
<br />
== Preparation ==<br />
=== Use in Thonny ===<br />
* Install Thonny ([https://thonny.org/ Thonny IDE])<br><br />
* Open Thonny, click on "Python x.x.x" at the bottom right, and select "Configure interpreter".<br><br />
[[File: ESP32-S3-Relay-6CH TO MicroPython Environment 1.png]]<br />
* In the popup window, open the "Interpreter" tab and set the interpreter to "MicroPython (ESP32)".<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 2.png]]<br />
* Save the setting.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 3.png]]<br />
* Click the Stop/Restart button. If the Shell window shows a banner like "MicroPython v1.22.1 on 2024-01-05; Generic ESP32S3 module with ESP32S3. Type "help()" for more information.", the connection is successful.<br><br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 4.png]]<br />
<br />
== Demo Upload ==<br />
* Download [https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32-S3-Relay-6CH-MicroPython.zip MicroPython sample demo].<br />
* Open the file window on Thonny.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 5.png]]<br />
* View the file path.<br><br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 6.png]]<br />
* Enter the path to the example files in Thonny.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 7.png]]<br />
* Hold Ctrl and select all files in the directory, then right-click and select "Upload to /".<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 8.png]]<br />
* Wait for the upload to finish.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 9.png]]<br />
* After a successful upload, the files appear in the device file window. The current example implements Bluetooth control of the device and can be extended for your own secondary development.<br />
[[File:ESP32-S3-Relay-6CH TO MicroPython Environment 10.png]]<br />
<br />
=Resource=<br />
==Schematic==<br />
*[https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32-S3-Relay-6CH-Sch.pdf Schematic Diagram]<br />
<br />
==Document==<br />
*[https://files.waveshare.com/upload/b/bd/Esp32-s3_datasheet_en.pdf ESP32-S3 Datasheet]<br />
*[https://files.waveshare.com/upload/1/11/Esp32-s3_technical_reference_manual_en.pdf ESP32-S3 Technical Reference Manual]<br />
*[https://files.waveshare.com/upload/8/87/Esp32-s3-wroom-1_wroom-1u_datasheet_en.pdf ESP32-S3-WROOM-1/WROOM-1U Datasheet]<br />
<br />
==Demo==<br />
*[https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/ESP32_S3_Relay_6CH_Demo.zip Demo]<br />
==Software==<br />
*[https://files.waveshare.com/wiki/ESP32-S3-Relay-6CH/SSCOM5.13.1_For_ESP32_S3_Relay_6CH.zip SSCOM]<br />
*[https://www.nordicsemi.com/Products/Development-tools/nRF-Connect-for-mobile nRF Connect]<br />
<br />
=FAQ=<br />
{{FAQ|When controlling other devices over RS485, the device is unresponsive or communication fails?<br />
|<br />
Please try moving the jumper cap to the 120R position. Some RS485 buses require a 120-ohm termination resistor for reliable communication.<br />
||}}<br />
{{FAQ|After downloading a program onto the module, the serial port sometimes fails to reconnect, or re-downloading fails?<br />
|<br />
Most download issues can be resolved with the following steps:<br><br />
Press and hold the BOOT button.<br><br />
While keeping the BOOT button pressed, press the RESET button.<br><br />
Release the RESET button first, then release the BOOT button.<br><br />
This puts the module into download mode and should resolve most download issues.<br />
||}}<br />
=Support=<br />
{{Servicebox1}}</div>Eng52