Rebase to HEAD (#34)
Rebase branch to HEAD to start restructure of home docs
afr2903 authored Oct 14, 2024
2 parents 15e357a + 7308e7d commit b4e900e
Showing 265 changed files with 1,786 additions and 39,019 deletions.
25 changes: 25 additions & 0 deletions .github/CODEOWNERS
@@ -0,0 +1,25 @@

# @Home
/docs/home/ @RoBorregos/home-integracion

/docs/home/Areas/Navigation.md @RoBorregos/home-nav
/docs/home/Areas/HRI.md @RoBorregos/home-hri
/docs/home/Areas/Integration\ and\ Networks.md @RoBorregos/home-integracion
/docs/home/Areas/Mechanics.md @RoBorregos/home-mecanica
/docs/home/Areas/Computer\ Vision.md @RoBorregos/home-vision
/docs/home/Areas/Manipulation.md @RoBorregos/home-manipulacion
/docs/home/Areas/Electronics\ and\ Control.md @RoBorregos/home-electronica

/docs/home/Aug\ 2022\ -\ Jun\ 2023/Human\ Robot\ Interaction/ @RoBorregos/home-hri
/docs/home/Aug\ 2022\ -\ Jun\ 2023/Mechanics/ @RoBorregos/home-mecanica
/docs/home/Aug\ 2022\ -\ Jun\ 2023/Integration\ and\ Networks/ @RoBorregos/home-integracion
/docs/home/Aug\ 2022\ -\ Jun\ 2023/Electronics\ and\ Control/ @RoBorregos/home-electronica
/docs/home/Aug\ 2022\ -\ Jun\ 2023/Computer\ Vision/ @RoBorregos/home-vision

/docs/home/Aug\ 2023\ -\ Jun\ 2024/Human\ Robot\ Interaction/ @RoBorregos/home-hri
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Mechanics/ @RoBorregos/home-mecanica
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Integration\ and\ Networks/ @RoBorregos/home-integracion
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Electronics\ and\ Control/ @RoBorregos/home-electronica
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Computer\ Vision/ @RoBorregos/home-vision
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Manipulation/ @RoBorregos/home-manipulacion
/docs/home/Aug\ 2023\ -\ Jun\ 2024/Navigation/ @RoBorregos/home-nav
23 changes: 23 additions & 0 deletions .github/workflows/format.yml
@@ -0,0 +1,23 @@
name: Test Format

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - name: Set up Python
      uses: actions/setup-python@v5
      with:
        python-version: '3.x'
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install -r requirements_test.txt
    - name: Test with pytest
      run: pytest
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
site/
__pycache__/
25 changes: 19 additions & 6 deletions README.md
@@ -13,7 +13,7 @@ Welcome to the RoBorregos Official documentation. This documentation is based on
## Add new page

To add a new page, locate the docs directory.
```{bash}
```bash
ROBORREGOS-DOCS
│ mkdocs.yml
│ requirements.txt
@@ -37,25 +37,38 @@ To add new images, add them to the assets folder. Preferably, use the same name
To run the documentation locally, you need to have python installed.

1. Clone the repository
```{bash}
```bash
git clone https://github.com/RoBorregos/RoBorregos-Docs.git
```

2. Install the requirements
```{bash}
```bash
pip install -r requirements.txt
```

3. Run the server
```{bash}
```bash
mkdocs serve
```

4. Open the browser and go to http://localhost:8000
4. If you encounter issues with the command not being found, try the following:
```bash
python -m mkdocs serve
```

5. Open the browser and go to http://localhost:8000

## Test
Run formatting tests
```bash
pip install -r requirements_test.txt
# At root directory of project
pytest
```

## Deploy
Please do not deploy without permission from the repository maintainer.
```{bash}
```bash
mkdocs gh-deploy
```

File renamed without changes.
@@ -58,4 +58,4 @@ cv2.destroyAllWindows()
```


![ArUco Detector using webcam](../../assets/LARC/arucos.jpg)
![ArUco Detector using webcam](/assets/LARC/arucos.jpg)
@@ -81,4 +81,4 @@ cv2.destroyAllWindows()



![Color detector image](../../assets/LARC/colores.png)
![Color detector image](/assets/LARC/colores.png)
@@ -12,7 +12,7 @@ To install TensorFlow Lite you need to run the following command:
pip install tflite-model-maker
```
### Dataset structure
![dataset structure](../../assets/LARC/dataset.png)
![dataset structure](/assets/LARC/dataset.png)

### Usage

@@ -107,4 +107,4 @@ print(data[max])

```

![F](../../assets/LARC/F.png) ![Stats](../../assets/LARC/stats.png)
![F](/assets/LARC/F.png) ![Stats](/assets/LARC/stats.png)
File renamed without changes.
20 changes: 20 additions & 0 deletions docs/LARC/2023/index.md
@@ -0,0 +1,20 @@
# @RescueMaze - 2023

The main developments during 2023 with respect to previous years are the following:


TODO: modify this.
## Mechanics

- [Jetson Nano](Jetson Nano/RunningJetson/)
- d

## Electronics

- [Jetson Nano](Jetson Nano/RunningJetson/)
-

## Programming

- [Jetson Nano](Jetson Nano/RunningJetson/)
- d
48 changes: 48 additions & 0 deletions docs/LARC/2024/Electronics/Description.md
@@ -0,0 +1,48 @@
## Introduction

For all the devices in the final product to interact with the computer, a microcontroller and a microprocessor are necessary. The Arduino MEGA 2560 was chosen as the microcontroller because its widely supported platform covers the connections needed for all the devices mentioned in the index. In addition, the Arduino MEGA 2560 offers a large number of digital inputs as well as PWM pins, which supported the decision to use it on the robot. A Raspberry Pi 4 was used as the microprocessor so that the robot could detect the packages through the cameras, and so that the Arduino and the Raspberry Pi could communicate bidirectionally.

![Arduino MEGA](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/3f3de84f-15eb-448e-8907-049563494433)
![Raspberry Pi 4](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/e7edb739-1892-431f-aaee-357dc9fa80dc)
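
The introduction above mentions bidirectional communication between the Arduino and the Raspberry Pi. As a rough illustration only (not code from this repository), the Raspberry Pi side of such a link is often handled over USB serial; in the sketch below the port name, baud rate, and message format are assumptions.

```python
# Illustrative Raspberry Pi side of a bidirectional serial link with the
# Arduino MEGA. Assumes pyserial is installed and that the Arduino enumerates
# as /dev/ttyACM0; the command/response strings are made up for this example.
import time

import serial

ser = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1)
time.sleep(2)  # give the Arduino time to reset after the port is opened

ser.write(b"GET_ENCODERS\n")            # request data from the microcontroller
reply = ser.readline().decode().strip()
print("Arduino replied:", reply)

ser.write(b"SET_SPEED 0.3 0.0 0.1\n")   # send a command (vx, vy, wz) back
ser.close()
```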

## Connections

The connections were made so that all required sensors and motor drivers are wired to the Arduino, with the exception of the motor drivers' power supplies, which are fed by separate LiPo batteries. The first LiPo battery (11.1 V) powers the Arduino through a voltage regulator that outputs 5 V to the microcontroller. The second LiPo battery (11.1 V) is connected to the wheel motors, and the third (same voltage) powers a stepper motor.

XT60 connectors were chosen for all the input voltages since they internally include diodes, which help protect the overall circuit in case the current ever flows in the opposite direction.

![XT60 connector](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/41ac4569-05e2-41f2-a3ef-e5b3c1a0e838)

There are also three LED indicators for the main voltage sources: the Arduino's converted supply voltage (3.6 V in the schematic, later changed to 5 V), the Arduino's unregulated supply voltage (the 22 V line in the schematic, now 11.1 V), and the motors' supply voltage (the 12 V line, ultimately 11.1 V).

![LED Indicators](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/78f574a7-48ff-4c49-bdfb-7e433015de9a)

There are also some support devices for the components, such as a push button to reset the camera. An OpenMV Cam H7 Plus was used at first, but it stopped turning on, so the team switched to regular USB cameras to process the packages for the challenge. This button was later repurposed as a reset for the Arduino MEGA.

The final schematic (designed in EasyEDA) looked like this:

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/894fd928-ac53-453b-8de9-c4fba6a924a9)

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/ba561a27-9d5b-4703-8b43-307b8e839eb6)

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/a65358df-78a8-4054-9abb-a1055380100a)


## PCB Development

A software tool such as EasyEDA can turn electronic schematics into Printed Circuit Boards (PCBs), which makes it possible to integrate many devices on a single board without making more physical electrical connections than are strictly required. In addition, a PCB reduces the risk of mistakes such as overvoltages or short circuits.

The PCB was designed so that the board could be reused modularly, in case any other challenge needed the same connections. This decision was made because many of the components on this PCB are basic to making an autonomous robot work. The PCB also uses female pin headers, since it is better to plug the components in than to solder them directly to the board.

Every voltage rail was handled so that no device, including the microcontroller and the microprocessor, would be exposed to more voltage than it can handle. Note that the Raspberry Pi is not included on the PCB, since it is powered externally by the same battery as the Arduino. Also, the PCB was initially meant to carry three different rails: 22 volts, 12 volts, and 19 volts; in the end, the best decision was to use three 11.1-volt batteries.

![PCB LARC 2024](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/c14e5f21-d7fe-4891-a089-c3382e36a039)

The final electronics result can be seen in this picture, taken on the day of the competition:

![PCB IRL](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/4c70fbd9-e77a-4ee7-a554-d3e90e14f275)

## Modifications

Other modifications included adding a new voltage regulator for the gripper's servos, since they need at least 6 volts to operate. It was therefore decided that the battery supplying the Arduino would also power those servos.
One difficulty encountered during development was that both the Raspberry Pi and the TMC2209 stepper driver were overheating, so two cooling fans were designed to fix this. Both fans are powered by almost 12 volts (supplied by one of the stepper motor driver's outputs).
24 changes: 24 additions & 0 deletions docs/LARC/2024/Mechanics/MDF_fixtures.md
@@ -0,0 +1,24 @@
## MDF

Generally speaking, Medium-Density Fibreboard (MDF) is a great material for building a simple structure or prototype; its key advantages are easy manufacturability and low cost. However, when designing the base of a robot, which needs to be sturdy, there are some limitations. Feel free to mix and combine fixture elements such as nuts and bolts. These types of fixtures are a great choice, since the forces they generate in response to stress act mainly perpendicular to the material and across all of its layers. Just as with 3D-printed parts, the material is most vulnerable when parallel loads are applied to some, but not all, of its layers. Therefore, adhesive fixtures that act only on the surface are not recommended.

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/159929034/73b36774-fb9e-47f2-8582-b191168e0a8b)

## Basic Fixtures

Taking these considerations into account, designing a fixture between two MDF parts is simple. Here, "basic fixtures" refers to cases where the two parts meet perpendicular to one another. There are 4 basic types of contact between two MDF parts in which some kind of movement is blocked. They are classified by where a "plug" part acts upon the "socket" part, and by whether or not the "socket" part covers both edges of the "plug" part. Generally speaking, the goal of an MDF structure is to block as many degrees of freedom as possible. As you can tell, these types of fixtures can only go so far on their own; however, when used in combination with one another, a group of these basic fixtures can do most of the job. Keep in mind that because laser cutting works so well with MDF, tolerances can be neglected. As long as you always account for the thickness of the material you will cut, you can design to your heart's content!

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/159929034/cccc199d-8ba3-4996-86dc-03f6cebddb9d)

## Half-Lap fixtures

A half-lap fixture occurs when each part is both "plug" and "socket", and the parts assemble not perpendicular to one another but along an axis parallel to both pieces. Both parts have a 'U' shape that interlocks with the other. These are particularly useful because the increased contact area creates greater friction, which helps keep the pieces in place. However, the main advantage of this type of fixture is that it locks movement in almost all directions. Therefore, it works best in structural beams and as an 'adapter' between two pieces.

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/159929034/d73d00ba-4e89-4f1c-9a9a-8c7b121b2183)

## Mix and Match

If the goal is to use as few nuts and bolts as possible, do not be afraid to make as many components as you like; the only limit is your imagination. Below is a part of the LARC 2024 robot chassis, where the middle beam (transparent) is held to the wheel wall (pink outline) by a lock piece (blue) that goes into the "plug" part of the beam and blocks movement parallel to the beam. Since the contact surface is relatively large, the friction works so well that it is even a hassle to disassemble!

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/159929034/1f6dfbdd-af93-4715-90a2-d82f7933f4a5)

49 changes: 49 additions & 0 deletions docs/LARC/2024/Programming/Robot Control/Velocity Control.md
@@ -0,0 +1,49 @@
## Holonomic Robot Overview

The microcontroller oversees the mechanical movements, keeping them aligned with the main engine's expectations. This entails controlling four drive motors with their respective encoders, a BNO055 IMU, floor phototransistors, the elevator, and the gripper. The control system supports mecanum-wheel drive spanning forward/backward, left/right, and rotation.


## Encoder for feedback

At the core of the control lies the use of encoders to maintain uniform motor speeds, alongside continuous odometry updates for spatial consistency. Wheel speeds are regulated by a PID control loop implemented for each wheel. Odometry is the driving force of this method and is what makes mecanum wheels viable. Given the many factors influencing chassis movement, such as weight distribution and mecanum-wheel precision, IMU feedback supplements the movement control, aiding directional precision.
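
As a rough illustration of that per-wheel loop (a minimal sketch, not the robot's firmware), the example below closes a PID speed loop around encoder feedback; the gains, encoder resolution, and wheel circumference are assumed values.

```python
# Minimal per-wheel PID speed loop fed by encoder ticks (illustrative only).
class WheelPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_speed, measured_speed):
        """Return a motor command (e.g. a PWM duty value) from the speed error."""
        error = target_speed - measured_speed
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def ticks_to_speed(delta_ticks, ticks_per_rev, wheel_circumference, dt):
    """Convert encoder ticks counted during dt into linear wheel speed (m/s)."""
    revolutions = delta_ticks / ticks_per_rev
    return revolutions * wheel_circumference / dt


# One control step for a single wheel at an assumed 10 ms loop period.
pid = WheelPID(kp=1.2, ki=0.5, kd=0.01, dt=0.01)
measured = ticks_to_speed(delta_ticks=12, ticks_per_rev=980,
                          wheel_circumference=0.24, dt=0.01)
pwm = pid.update(target_speed=0.5, measured_speed=measured)
```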

## Kinematic Equations

As described below, these equations govern the direction and speed control of mecanum drive, facilitating precise movement across the field.

\[ \text{front\_left} = \frac{1}{r}(v_x - v_y - (l_x + l_y)z) \]

\[ \text{front\_right} = \frac{1}{r}(v_x + v_y + (l_x + l_y)z) \]

\[ \text{back\_left} = \frac{1}{r}(v_x + v_y - (l_x + l_y)z) \]

\[ \text{back\_right} = \frac{1}{r}(v_x - v_y + (l_x + l_y)z) \]

where

\[ v_x = \text{linear velocity along the X-axis} \]

\[ v_y = \text{linear velocity along the Y-axis} \]

\[ l_x = \text{wheel base distance} \]

\[ l_y = \text{wheel track distance} \]

\[ z = \text{angular speed} \]

\[ r = \text{wheel radius} \]




For autonomous driving, inverse kinematic equations are used. They are derived from the mathematical model of a four-mecanum-wheeled robot and calculate the required velocity for each individual motor at every instant of the robot's movement.


The previous equations produce a velocity for each wheel, and each wheel must transform that linear velocity into angular movement, so a linear-velocity-to-RPM conversion is used:

\[
\text{RPM} = \frac{\omega_i \cdot 60}{\text{wheelCircumference}}
\]
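
A minimal sketch of these two steps (wheel speeds from the commanded motion, then the RPM conversion) is shown below; the wheel radius and chassis dimensions are assumed example values, not the robot's real ones.

```python
import math

def mecanum_wheel_speeds(vx, vy, wz, r, lx, ly):
    """Inverse kinematics: wheel angular speeds (rad/s) from the body velocities."""
    k = lx + ly
    return {
        "front_left":  (vx - vy - k * wz) / r,
        "front_right": (vx + vy + k * wz) / r,
        "back_left":   (vx + vy - k * wz) / r,
        "back_right":  (vx - vy + k * wz) / r,
    }

def wheel_speed_to_rpm(omega, r):
    """Convert a wheel's angular speed (rad/s) to RPM using its circumference."""
    circumference = 2 * math.pi * r
    linear_speed = omega * r                  # speed at the wheel rim (m/s)
    return linear_speed * 60 / circumference  # same as omega * 60 / (2 * pi)

# Example: drive forward at 0.4 m/s while rotating at 0.2 rad/s.
# r, lx, ly are assumed dimensions in meters.
speeds = mecanum_wheel_speeds(vx=0.4, vy=0.0, wz=0.2, r=0.04, lx=0.10, ly=0.12)
rpms = {wheel: wheel_speed_to_rpm(w, r=0.04) for wheel, w in speeds.items()}
print(rpms)
```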


The resulting RPM then serves as the input to the embedded control system. At the robot's core, PID-type controllers for each motor work to maintain the robot's stability throughout its trajectory. Furthermore, phototransistors and both cameras bolster precision and help confirm orientation, alongside measuring distance to objects.


32 changes: 32 additions & 0 deletions docs/LARC/2024/index.md
@@ -0,0 +1,32 @@
# @LARC - 2024

The main developments during 2024 with respect to previous years are the following:


TODO: modify this.
## Mechanics

- [Jetson Nano](Jetson Nano/RunningJetson/)
- d

## Electronics

- Arduino MEGA 2560
- Servomotors
- Raspberry Pi 4 Model B+
- Pololu Gearmotor Encoder
- 16-Channel Analog Digital Multiplexer
- Reflectance Sensor Array: 4-Channel Analog Output QTR-HD-O4A
- XL4015 Voltage & Current Regulator
- 3 11.1V LiPo Batteries
- IR Sensor
- Stepper motor
- LEDs
- Resistors
- TMC2209 Stepper Motor Driver
- USB Cameras

## Programming

- [Jetson Nano](Jetson Nano/RunningJetson/)
- d
38 changes: 35 additions & 3 deletions docs/LARC/index.md
@@ -1,3 +1,35 @@
# LARC 2023
## Latin American Robotics Competition
### A challenge
# Latin American Robotics Competition Open Challenge

The competition context is aimed at automating an environment with a large number of packages to be organized (Figure 1). The essence of the competition is drawn from environments such as warehouses, product distribution centers, store stock rooms, etc.

Warehouse automation is already a reality in large companies like Amazon and Alibaba, and it should become a reality in midsize companies soon. Think of a possible solution: participants must build an agile, fast robot to organize as many packages as possible in a limited time.

![image](https://github.com/RoBorregos/RoBorregos-Docs/assets/117100165/26c51e5c-5eba-4a80-ac3d-0598fa39a195)

## The goal

The robot can move freely in the scenario but cannot collide with or push a package out of the packages' area. To meet the challenges of the competition, the robot must take each package and deliver it to its destination. The robot will not know its initial position in the scenario, nor the position of the packages in the packages' area. The objective is to take packages from a specific location to predefined locations, so that, at the end, the packages are in the desired arrangement in the proposed scenario. The specific objectives are:
1. Take colored packages (yellow, red, green, blue) and move them to the unloading regions
with equivalent colors.
2. Pick up packages containing 2D codes and move them to any respective position on the
shelves.
3. Pick up packages with alphabetical values and take them to any respective position on
the shelves.

## Packages

Packages can be marked by color, 2D codes, or alphabetical values. The possible colors for the packages are green, yellow, blue, and red. The 2D code is a two-dimensional representation with 9 combinations, from one to nine, following the markers that can be obtained from the site https://chev.me/arucogen. The alphabetical packages are white and the 2D-code packages are black. There is a specific region in the scenario where the packages are initially positioned, called the loading region.
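
As a rough illustration only (this is not the team's detector), the 2D-code packages could be read with OpenCV's ArUco module; the dictionary choice, camera index, and OpenCV version (4.7+ API) below are assumptions.

```python
# Minimal sketch for reading 4x4 ArUco markers (as generated at
# https://chev.me/arucogen) from a USB camera frame. Requires
# opencv-contrib-python 4.7 or newer for the ArucoDetector class.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # first USB camera
ret, frame = cap.read()
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, rejected = detector.detectMarkers(gray)
    if ids is not None:
        # ids holds the marker numbers (1-9 for the competition packages)
        print("Detected package codes:", ids.flatten().tolist())
cap.release()
```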
