To assess the operational performance of the Raspberry Pi 5 during LLM inference, we collected a set of metrics while running inference tests with 1, 2, 4, or 8 simultaneous users and, for each user count, with 1, 2, 3, or 4 threads. The results of these measurements are presented below for each dimension, along with the accompanying analysis.
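As a rough illustration of how such a sweep might be instrumented, the sketch below iterates over the user/thread combinations and samples basic system metrics (CPU load, memory usage, SoC temperature) around each run. The metric set, the `psutil` dependency, and the `run_inference_test` placeholder are assumptions for illustration only; the actual measurement tooling used for the tests is not specified here.

```python
import time
import psutil  # assumed here for CPU/memory sampling; not necessarily the tool used in the tests

# User/thread combinations described above.
USER_COUNTS = [1, 2, 4, 8]
THREAD_COUNTS = [1, 2, 3, 4]

def read_cpu_temp():
    """Read the SoC temperature from the standard sysfs node on Raspberry Pi OS."""
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0  # millidegrees Celsius -> degrees Celsius

def sample_metrics():
    """Collect one snapshot of the operational metrics monitored during a run."""
    return {
        "timestamp": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=None),
        "mem_percent": psutil.virtual_memory().percent,
        "cpu_temp_c": read_cpu_temp(),
    }

results = []
for users in USER_COUNTS:
    for threads in THREAD_COUNTS:
        # run_inference_test() is a hypothetical placeholder for the actual benchmark
        # call (e.g. launching `users` concurrent clients against an inference server
        # configured with `threads` threads); it is left commented out here.
        before = sample_metrics()
        # run_inference_test(users=users, threads=threads)
        after = sample_metrics()
        results.append({"users": users, "threads": threads,
                        "before": before, "after": after})
```

In practice, metrics would typically be sampled continuously throughout each run rather than only before and after; the snapshot-based loop above is kept minimal to show the structure of the sweep.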