
Investigate On-Robot Diagnostics Through a CLI Interface #3379

Open
wants to merge 41 commits into base: master
Conversation


@Andrewyx Andrewyx commented Oct 20, 2024

Flash with bazel run //software/embedded/ansible:run_ansible --platforms=//cc_toolchain:robot --//software/embedded:host_platform=PI -- --playbook deploy_robot_software.yml --hosts 10.42.0.53 --ssh_pass thunderbots

Please fill out the following before requesting review on this PR

Description

Introduces a new CLI tool with robot_diagnostics_features to run natively on robots. This feature will be usable while SSH'ed into the robot and includes simplified commands ported over from the robot_diagnostics GUI.

Testing Done

Resolved Issues

Resolves #3322

Length Justification and Key Files to Review

Review Checklist

It is the reviewer's responsibility to also make sure every item here has been covered

  • Function & Class comments: All function definitions (usually in the .h file) should have a javadoc style comment at the start of them. For examples, see the functions defined in thunderbots/software/geom. Similarly, all classes should have an associated Javadoc comment explaining the purpose of the class.
  • Remove all commented out code
  • Remove extra print statements: for example, those just used for testing
  • Resolve all TODOs: All TODO (or similar) statements should either be completed or associated with a GitHub issue


@itsarune itsarune left a comment


looking great! excited to see how this turns out


Mr-Anyone commented Nov 12, 2024

fire branch with clean git diff.

Comment on lines 1 to +2
bazel_dep(name = "bazel_skylib", version = "1.5.0")
bazel_dep(name = "rules_pkg", version = "1.0.1")

You should delete this file. We probably aren't going to move to MODULE.bazel anytime soon.

I remember reading an issue about rules_python and pybind11_protobuf having issues with a locally installed toolchain somewhere when using MODULE.

@@ -0,0 +1,97 @@
load("@rules_pkg//:providers.bzl", "PackageFilesInfo", "PackageSymlinkInfo", "PackageFilegroupInfo")

interesting.

This could be the next nanopb_proto_library in our codebase:

load("@rules_proto//proto:defs.bzl", "ProtoInfo")
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
# The relative filename of the header file.
_FILENAME = "lib/{dirname}/{filename}"
_PROTO_DIR = "{path}/proto"
_OPTIONS_DIR = "{path}/generator/nanopb"
# Command that copies the source to the destination.
_COPY_COMMAND = "cp {source} {destination}"
# Command that zips files recursively. It enters the output directory first so
# that the zipped path starts at lib/.
_ZIP_COMMAND = "cd {output_dir} && zip -qq -r -u {zip_filename} lib/"
def _nanopb_proto_library_impl(ctx):
    # This is the folder we will place all our generation artifacts in
    #
    # Because we need to generate the `.c` and `.h` files for all the proto files required
    # for this library, including all the transitive dependencies, we would like to
    # generate these in the same folder as each proto file. However bazel will not
    # permit us to modify/produce files outside the folder that this rule is called from.
    # We get around this by reproducing the folder structure under a generation folder
    # and adding the root of this directory as an include path.
    #
    generation_folder_name = ctx.attr.name + "_nanopb_gen/"
    generated_folder_abs_path = ctx.genfiles_dir.path + "/" + \
                                ctx.build_file_path[:-len("BUILD")] + generation_folder_name

    # Generate import flags for the protobuf compiler so it can find proto files we
    # depend on, and a list of proto files to include
    all_proto_files = depset()
    all_proto_include_dirs = depset()
    for dep in ctx.attr.deps:
        all_proto_files = depset(
            transitive = [all_proto_files, dep[ProtoInfo].transitive_sources],
        )
        all_proto_include_dirs = depset(
            transitive = [all_proto_include_dirs, dep[ProtoInfo].transitive_proto_path],
        )

    all_proto_hdr_files = []
    all_proto_src_files = []

    for proto_file in all_proto_files.to_list():
        # Generate the `.pb` files using the protobuf compiler
        pb_file_name = generation_folder_name + proto_file.path[:-len(".proto")] + ".pb"
        pb_file = ctx.actions.declare_file(pb_file_name)

        # Create the arguments for the proto compiler to compile the proto file,
        # adding all the transitive include directories
        proto_compile_args = ["-o", pb_file.path, proto_file.path]
        for path in all_proto_include_dirs.to_list():
            proto_compile_args += ["-I%s" % path]

        ctx.actions.run(
            inputs = all_proto_files,
            outputs = [pb_file],
            arguments = proto_compile_args,
            executable = ctx.executable.protoc,
            mnemonic = "ProtoCompile",
            use_default_shell_env = True,
        )

        # Generate the equivalent C code using Nanopb
        h_out_name = generation_folder_name + proto_file.path[:-len(".proto")] + ".nanopb.h"
        c_out_name = generation_folder_name + proto_file.path[:-len(".proto")] + ".nanopb.c"
        c_out = ctx.actions.declare_file(c_out_name)
        h_out = ctx.actions.declare_file(h_out_name)

        ctx.actions.run_shell(
            tools = [ctx.executable.nanopb_generator],
            inputs = [pb_file],
            outputs = [c_out, h_out],
            mnemonic = "NanopbGeneration",
            command = "%s -e .nanopb %s" % (ctx.executable.nanopb_generator.path, pb_file.path),
        )

        all_proto_src_files.append(c_out)
        all_proto_hdr_files.append(h_out)

    cc_toolchain = find_cpp_toolchain(ctx)
    feature_configuration = cc_common.configure_features(
        ctx = ctx,
        cc_toolchain = cc_toolchain,
    )

    # Get the compilation and linking contexts from all nanopb srcs
    nanopb_compilation_contexts = [
        label[CcInfo].compilation_context
        for label in ctx.attr.nanopb_libs
        if label[CcInfo].compilation_context != None
    ]
    nanopb_linking_contexts = [
        label[CcInfo].linking_context
        for label in ctx.attr.nanopb_libs
        if label[CcInfo].linking_context != None
    ]

    (compilation_context, compilation_outputs) = cc_common.compile(
        name = "compile_nanopb_outputs",
        actions = ctx.actions,
        feature_configuration = feature_configuration,
        cc_toolchain = cc_toolchain,
        srcs = all_proto_src_files,
        public_hdrs = all_proto_hdr_files,
        includes = [
            generated_folder_abs_path,
        ] + [generated_folder_abs_path + dir for dir in all_proto_include_dirs.to_list()],
        compilation_contexts = nanopb_compilation_contexts,
    )

    (linking_context, linking_outputs) = \
        cc_common.create_linking_context_from_compilation_outputs(
            name = "link_nanopb_outputs",
            compilation_outputs = compilation_outputs,
            actions = ctx.actions,
            feature_configuration = feature_configuration,
            cc_toolchain = cc_toolchain,
            linking_contexts = nanopb_linking_contexts,
        )

    # platformio_* bazel rules require a provider named transitive_zip_files and an output zip file
    # these contain all files needed for compilation with platformio.
    name = ctx.label.name
    commands = []
    inputs = all_proto_hdr_files + all_proto_src_files
    outputs = []

    for hdr_file in all_proto_hdr_files:
        dir = _PROTO_DIR.format(path = name)
        if "options.nanopb." in hdr_file.basename:
            dir = _OPTIONS_DIR.format(path = name)
        file = ctx.actions.declare_file(
            _FILENAME.format(dirname = dir, filename = hdr_file.basename),
        )
        outputs.append(file)
        commands.append(_COPY_COMMAND.format(
            source = hdr_file.path,
            destination = file.path,
        ))

    for src_file in all_proto_src_files:
        dir = _PROTO_DIR.format(path = name)
        if "options.nanopb." in src_file.basename:
            dir = _OPTIONS_DIR.format(path = name)
        file = ctx.actions.declare_file(
            _FILENAME.format(dirname = dir, filename = src_file.basename),
        )
        outputs.append(file)
        commands.append(_COPY_COMMAND.format(
            source = src_file.path,
            destination = file.path,
        ))

    outputs.append(ctx.outputs.zip)
    commands.append(_ZIP_COMMAND.format(
        output_dir = ctx.outputs.zip.dirname,
        zip_filename = ctx.outputs.zip.basename,
    ))

    ctx.actions.run_shell(
        inputs = inputs,
        outputs = outputs,
        command = "\n".join(commands),
    )

    return struct(
        transitive_zip_files = depset([ctx.outputs.zip]),
        providers = [CcInfo(
            compilation_context = compilation_context,
            linking_context = linking_context,
        )],
    )

nanopb_proto_library = rule(
    implementation = _nanopb_proto_library_impl,
    outputs = {
        "zip": "%{name}.zip",  # output needed to be included in platformio_library
    },
    attrs = {
        "deps": attr.label_list(
            mandatory = True,
            providers = [
                ProtoInfo,
            ],
        ),
        "nanopb_libs": attr.label_list(
            providers = [CcInfo],
            default = [Label("@nanopb//:nanopb_header")],
        ),
        "nanopb_generator": attr.label(
            executable = True,
            cfg = "host",
            default = Label("@nanopb//:nanopb_generator"),
        ),
        "protoc": attr.label(
            executable = True,
            cfg = "host",
            default = Label("@com_google_protobuf//:protoc"),
        ),
        "_cc_toolchain": attr.label(
            default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
        ),
    },
    provides = [
        CcInfo,
    ],
    toolchains = ["@bazel_tools//tools/cpp:toolchain_type"],
    fragments = ["cpp"],
    host_fragments = ["cpp"],
)
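
For reference, here is a minimal sketch of how a rule like this is typically consumed from a BUILD file; the load path and target names below are hypothetical:

load("@rules_proto//proto:defs.bzl", "proto_library")
load("//software/embedded:nanopb_proto_library.bzl", "nanopb_proto_library")  # hypothetical path

# Hypothetical proto_library target that supplies ProtoInfo to the nanopb rule.
proto_library(
    name = "robot_log_proto",
    srcs = ["robot_log.proto"],
)

# Generates .nanopb.c/.nanopb.h sources from the proto and exposes them via CcInfo.
nanopb_proto_library(
    name = "robot_log_nanopb",
    deps = [":robot_log_proto"],
)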

@Mr-Anyone

looking good.


@itsarune itsarune left a comment


great work so far with this refactor. I think it's coming together and don't have any big feedback

@Andrewyx Andrewyx marked this pull request as ready for review January 31, 2025 21:11
Comment on lines +27 to +29
ESTOP_PATH_1 = "/dev/ttyACM0"
ESTOP_PATH_2 = "/dev/ttyUSB0"
ESTOP_PATH_3 = "/dev/ttyUSB1"

you can use udev rules (https://www.reactivated.net/writing_udev_rules.html#basic) to make the estop appear as /dev/estop
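
A hedged sketch of what such a rule could look like, e.g. in /etc/udev/rules.d/99-estop.rules; the vendor/product IDs are placeholders that would need to be read off the real estop hardware (udevadm info can report them):

# Placeholder IDs: replace xxxx/yyyy with the estop's actual USB vendor/product IDs.
SUBSYSTEM=="tty", ATTRS{idVendor}=="xxxx", ATTRS{idProduct}=="yyyy", SYMLINK+="estop"

With a rule like that in place, the three hard-coded ESTOP_PATH_* candidates could collapse to a single /dev/estop path.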

Comment on lines +129 to +141
self.receive_robot_status = tbots_cpp.RobotStatusProtoListener(
    str(getRobotMulticastChannel(
        self.channel_id)) + "%" + f"{self.embedded_data.get_network_interface()}",
    ROBOT_STATUS_PORT,
    self.__receive_robot_status,
    True,
)

# Multicast Sender
self.primitive_set_sender = tbots_cpp.PrimitiveSetProtoUdpSender(
    str(getRobotMulticastChannel(
        self.channel_id)) + "%" + f"{self.embedded_data.get_network_interface()}", PRIMITIVE_PORT, True
)

fyi the networking interface changed for this on upstream

Comment on lines +58 to +61
self.epoch_timestamp_seconds = robot_status.time_sent.epoch_timestamp_seconds
self.battery_voltage = robot_status.power_status.battery_voltage
self.primitive_packet_loss_percentage = robot_status.network_status.primitive_packet_loss_percentage
self.primitive_executor_step_time_ms = robot_status.thunderloop_status.primitive_executor_step_time_ms

have you considered keeping this state in self.embedded_data so that all the data is in one place?
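
A rough sketch of what that consolidation could look like; this is purely illustrative, the attribute names mirror the snippet above, and embedded_data is assumed to be a plain data object:

# Hypothetical sketch: store the latest RobotStatus fields on the shared
# embedded_data object instead of on the diagnostics class itself.
def __receive_robot_status(self, robot_status) -> None:
    self.embedded_data.epoch_timestamp_seconds = (
        robot_status.time_sent.epoch_timestamp_seconds
    )
    self.embedded_data.battery_voltage = robot_status.power_status.battery_voltage
    self.embedded_data.primitive_packet_loss_percentage = (
        robot_status.network_status.primitive_packet_loss_percentage
    )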

Comment on lines +47 to +57
def __clamp(self, val: float, min_val: float, max_val: float) -> float:
    """Simple Math Clamp function (Faster than numpy & fewer dependencies)
    :param val: Value to clamp
    :param min_val: Minimum (Lower) Bound
    :param max_val: Maximum (Upper) Bound
    """
    return min(max(val, min_val), max_val)

def get_rotate_primitive(self, velocity: float) -> Primitive:
    """Prepares and returns the processed direct control primitive given a velocity
    :param velocity: Angular Velocity to rotate the robot

The helpers in this file, get_rotate_primitive and the other get_xxx_primitive functions, aren't really data. Maybe putting these helpers in a different file like embedded_control.py would be better.
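
For illustration, a minimal sketch of what that split might look like; the module name comes from the comment above and everything else here is an assumption:

# embedded_control.py (hypothetical new module): free helper functions for
# building control primitives, leaving the data class to hold state only.
def clamp(val: float, min_val: float, max_val: float) -> float:
    """Clamp val into the closed interval [min_val, max_val]."""
    return min(max(val, min_val), max_val)

The get_xxx_primitive builders would move alongside it in the same way.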

    logging.exception(f"Unknown Exception: {e}")
    raise Typer.Exit(code=exit_code)
finally:
    self.embedded_communication.send_primitive_set(Primitive(stop=StopPrimitive()))

with this finally clause, do you even need to send stop primitives in the other cases?

:param velocity: Velocity to rotate the wheel
:param duration_seconds: Duration to move
"""
# TODO: Confirm max speed for wheel rotation (it is currently net robot velocity)

resolve TODO or make an issue

description
)

def emote(self):

make github issue

@@ -222,6 +222,7 @@ void Thunderloop::runLoop()
network_status_.set_ms_since_last_primitive_received(
getMilliseconds(time_since_last_primitive_received));

// LOG(INFO) << "PRIMITIVE INFO: " << primitive_set_.DebugString();

delete

@itsarune itsarune left a comment


left some feedback
