Feature/improved event logging #2214
Conversation
📝 Walkthrough

This pull request introduces enhancements to logging and metadata handling across multiple services in the SpiffWorkflow backend. The changes focus on improving event logging, metadata extraction, and configuration management. A new configuration variable for the event stream source is added, and several services are modified to provide more detailed logging, including changes to log formatting, event tracking, and metadata storage. The modifications aim to provide more comprehensive and flexible logging capabilities while simplifying some existing code structures.
Sequence Diagram

```mermaid
sequenceDiagram
    participant WES as Workflow Execution Service
    participant PS as Process Model Service
    participant LS as Logging Service
    participant PIP as Process Instance Processor
    WES->>PS: extract_metadata()
    PS-->>WES: return metadata
    WES->>LS: log_event(message, log_extras)
    WES->>PIP: store_metadata(metadata)
    PIP-->>WES: metadata stored
```
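The sequence above can be sketched with hypothetical Python stubs; every function and field name below is illustrative, not the actual backend API:

```python
# Illustrative stubs for the interaction between the services in the diagram.
# None of these names are the real SpiffWorkflow backend API.

def extract_metadata(task_data: dict) -> dict:
    """Process Model Service role: pull reportable fields from task data."""
    return {"invoice_id": task_data.get("invoice_id")}

def log_event(message: str, log_extras: dict) -> dict:
    """Logging Service role: attach extras to a structured event."""
    return {"message": message, **log_extras}

def store_metadata(store: dict, metadata: dict) -> None:
    """Process Instance Processor role: persist the metadata."""
    store.update(metadata)

# Workflow Execution Service role: drive the sequence end to end.
task_data = {"invoice_id": "INV-7"}
metadata = extract_metadata(task_data)
event = log_event("task_completed", {"metadata": metadata})
instance_store: dict = {}
store_metadata(instance_store, metadata)
```

The point of the ordering is that metadata is extracted once and then reused for both the log event and the persisted record.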
Actionable comments posted: 0
🧹 Nitpick comments (2)
spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (1)

Lines 62-100: Consider extracting property lists to class constants.

The property lists for different record types could be moved to class constants to improve maintainability and reusability.

```diff
 class SpiffLogHandler(SocketHandler):
+    TASK_PROPERTIES = [
+        "workflow_spec",
+        "task_spec",
+        "task_id",
+        "task_type",
+        "state",
+        "last_state_change",
+        "elapsed",
+        "parent",
+    ]
+
+    WORKFLOW_PROPERTIES = ["workflow_spec", "completed", "success"]
+
+    EVENT_PROPERTIES = ["bpmn_name", "milestone", "task_id", "task_spec", "metadata", "error_info"]
+
     def filter(self, record: Any) -> bool:
         if record.name.startswith("spiff"):
             # ... existing code ...
             if record.name == "spiff.task":
-                properties = [
-                    "workflow_spec",
-                    "task_spec",
-                    "task_id",
-                    "task_type",
-                    "state",
-                    "last_state_change",
-                    "elapsed",
-                    "parent",
-                ]
+                properties = self.TASK_PROPERTIES
             elif record.name == "spiff.workflow":
-                properties = ["workflow_spec", "completed", "success"]
+                properties = self.WORKFLOW_PROPERTIES
             elif record.name == "spiff.event":
-                properties = ["bpmn_name", "milestone", "task_id", "task_spec", "metadata", "error_info"]
+                properties = self.EVENT_PROPERTIES
```

spiffworkflow-backend/src/spiffworkflow_backend/services/process_model_service.py (1)
Lines 131-146: Consider tracking these schema suggestions in a GitHub issue.

The comments outline a good suggestion for caching metadata extraction paths in database tables. This would improve performance by reducing git operations.
Would you like me to create a GitHub issue to track this database schema enhancement suggestion?
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)

- spiffworkflow-backend/src/spiffworkflow_backend/config/default.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/routes/process_instances_controller.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (3 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (3 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_tmp_service.py (4 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/process_model_service.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/task_service.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/workflow_execution_service.py (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (13)
- GitHub Check: typeguard 3.12 / ubuntu-latest sqlite
- GitHub Check: typeguard 3.11 / ubuntu-latest sqlite
- GitHub Check: tests 3.11 / macos-latest sqlite
- GitHub Check: tests 3.10 / ubuntu-latest sqlite
- GitHub Check: tests 3.12 / ubuntu-latest sqlite
- GitHub Check: tests 3.11 / ubuntu-latest sqlite
- GitHub Check: tests 3.12 / ubuntu-latest postgres
- GitHub Check: tests 3.11 / ubuntu-latest postgres
- GitHub Check: tests 3.12 / ubuntu-latest mysql
- GitHub Check: tests 3.11 / ubuntu-latest mysql
- GitHub Check: check_docker_start_script
- GitHub Check: build
- GitHub Check: pixeebot[bot]
🔇 Additional comments (17)
spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_tmp_service.py (4)
Line 36: LGTM! Good addition of the `log_event` parameter.

The new parameter provides flexibility to control logging behavior, which is particularly useful when logging needs to be handled elsewhere.

Lines 52-53: Initialize log_extras with task_id early for consistent logging.

Good practice to initialize the dictionary early with the base information that will be present in all logs.

Lines 89-94: Comprehensive error information capture.

The error information capture is thorough and includes all relevant details (trace, line number, offset, content) that would be helpful for debugging.

Lines 99-101: Well-structured conditional logging.

The conditional logging, with the comment explaining why some events need to be logged elsewhere, is clear and maintainable.
spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (2)
Lines 39-46: Well-structured event format with version control.

The new JSON structure follows good practices:
- Includes version for future compatibility
- Uses standardized fields (type, id, source, timestamp)
- Cleanly separates metadata from data
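As a rough illustration, an envelope with those characteristics might be assembled like this; the field names and values here are assumptions, not the backend's exact schema:

```python
# Hedged sketch of a versioned event envelope with standard type/id/source/
# timestamp fields and the metadata kept separate from the payload.
import json
import time
import uuid

def build_event(event_type: str, source: str, data: dict, metadata: dict) -> str:
    envelope = {
        "version": 1,                 # lets the schema evolve without breaking consumers
        "type": event_type,           # e.g. "task_completed"
        "id": str(uuid.uuid4()),      # unique id for event correlation
        "source": source,             # configurable event stream source
        "timestamp": time.time(),
        "metadata": metadata,         # process metadata, separate from the payload
        "data": data,                 # event payload
    }
    return json.dumps(envelope)

event = json.loads(
    build_event("task_completed", "spiff-backend", {"task_id": "t1"}, {"milestone": "Started"})
)
```

Keeping metadata and data in separate top-level keys means consumers can index or filter on metadata without parsing the payload.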
Lines 298-302: Clean and simplified logging interface.

The `LoggingService` class has been simplified with a cleaner interface that accepts a message and optional extras. This makes the service more flexible and easier to use.

spiffworkflow-backend/src/spiffworkflow_backend/config/default.py (1)
Line 171: LGTM! Good addition of event stream source configuration.

The new configuration variable follows the existing pattern and provides a sensible default value.
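For context, config entries like this typically fall back to an environment variable; a sketch of that pattern follows, where the variable name and default value are assumptions for illustration only:

```python
# Sketch of an environment-variable-backed config default, the pattern a
# default.py config module commonly follows. The variable name and the
# fallback value here are assumptions, not the backend's actual settings.
import os

SPIFFWORKFLOW_BACKEND_EVENT_STREAM_SOURCE = os.environ.get(
    "SPIFFWORKFLOW_BACKEND_EVENT_STREAM_SOURCE",
    "spiff-workflow-backend",  # used when the environment variable is unset
)
```

The benefit of this shape is that deployments override the value via the environment while local development gets a working default.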
spiffworkflow-backend/src/spiffworkflow_backend/services/workflow_execution_service.py (3)
Line 44: LGTM! Well-organized imports.

The new imports are correctly placed and maintain the file's organization.

Also applies to: 47-47

Lines 332-343: Well-structured logging with comprehensive metadata.

The logging implementation captures all relevant task information and metadata in a clean, organized manner.

Line 349: Good integration of milestone tracking with logging.

The milestone handling is properly integrated with the logging system and maintains the process instance state correctly.
Also applies to: 356-357
spiffworkflow-backend/src/spiffworkflow_backend/routes/process_instances_controller.py (1)
Lines 73-78: LGTM! Enhanced logging with structured context.

The changes improve logging by adding structured context, including the milestone, process model identifier, and process instance id. This makes the logs more consistent and easier to query.
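Structured context like this typically rides on the stdlib logger's `extra` mechanism, which attaches each key as an attribute on the log record. A minimal sketch, with illustrative logger name and keys:

```python
# Sketch of passing structured context to a logger via `extra`. The logger
# name and the context keys are illustrative, not the controller's exact code.
import logging

class CaptureHandler(logging.Handler):
    """Collects records so we can inspect the structured extras."""
    def __init__(self) -> None:
        super().__init__()
        self.records: list[logging.LogRecord] = []

    def emit(self, record: logging.LogRecord) -> None:
        self.records.append(record)

logger = logging.getLogger("spiff.event.example")
handler = CaptureHandler()
logger.addHandler(handler)
logger.setLevel(logging.INFO)

log_extras = {
    "milestone": "Completed",
    "process_model_identifier": "group/my-model",
    "process_instance_id": 42,
}
# Each key in `extra` becomes an attribute on the emitted LogRecord.
logger.info("process_instance_terminated", extra=log_extras)

record = handler.records[0]
```

Because the extras land as record attributes rather than being interpolated into the message, downstream handlers and log aggregators can query them as structured fields.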
spiffworkflow-backend/src/spiffworkflow_backend/services/process_model_service.py (1)
Lines 147-167: LGTM! Robust metadata extraction implementation.

The implementation:
- Handles missing/empty extraction paths gracefully
- Uses path segments to traverse nested data structures
- Returns empty dict if no extraction paths are configured
- Safely handles missing data paths
spiffworkflow-backend/src/spiffworkflow_backend/services/task_service.py (1)
Line 249: LGTM! Shifted logging responsibility to execution service.

The change helps avoid duplicate logging by deferring the logging responsibility to the execution service.
spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (4)
Lines 996-1000: LGTM! Clean metadata extraction implementation.

The implementation cleanly delegates to ProcessModelService, maintaining separation of concerns.

Line range hint 1002-1014: LGTM! Robust metadata storage implementation.

The implementation:
- Handles both new and existing metadata records
- Properly truncates values to fit database constraints
- Uses appropriate database operations
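A toy sketch of that upsert-with-truncation behavior, using a plain dict in place of the database; the 255-character limit and the store shape are assumptions:

```python
# Sketch of upsert-with-truncation for metadata storage: update existing
# entries, insert new ones, and truncate values so they fit an assumed
# database column constraint.
METADATA_VALUE_MAX_LENGTH = 255  # assumed column limit, for illustration

def store_metadata(store: dict, instance_id: int, metadata: dict) -> None:
    rows = store.setdefault(instance_id, {})  # new or existing record set
    for key, value in metadata.items():
        # Stringify and truncate so the value always fits the column.
        rows[key] = str(value)[:METADATA_VALUE_MAX_LENGTH]

store: dict = {}
store_metadata(store, 1, {"invoice_id": "INV-7", "notes": "x" * 300})
store_metadata(store, 1, {"invoice_id": "INV-8"})  # upsert of an existing key
```

Truncating at write time trades a small loss of fidelity for a guarantee that inserts never fail on oversized values.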
Lines 1170-1176: LGTM! Enhanced process completion logging.

The implementation adds valuable context to process completion logs, including milestone and metadata information.

Lines 1821-1829: LGTM! Enhanced task completion logging.

The implementation adds comprehensive context to task completion logs, including task details and metadata.
Actionable comments posted: 0
🧹 Nitpick comments (1)
spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (1)
Lines 96-101: Consider extracting property handling logic.

The attribute processing logic could be more maintainable. Consider extracting it into a dedicated method.

```diff
-        for attr in properties:
-            if hasattr(record, attr):
-                data[attr] = getattr(record, attr)
-                if not (data[attr] is None or isinstance(data[attr], dict)):
-                    data[attr] = str(data[attr])
-        record._spiff_data = data
+        def process_attributes(record: Any, properties: list[str], data: dict) -> dict:
+            for attr in properties:
+                if hasattr(record, attr):
+                    value = getattr(record, attr)
+                    if not (value is None or isinstance(value, dict)):
+                        value = str(value)
+                    data[attr] = value
+            return data
+
+        record._spiff_data = process_attributes(record, properties, data)
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)

- spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (12)
- GitHub Check: typeguard 3.11 / ubuntu-latest sqlite
- GitHub Check: tests 3.11 / macos-latest sqlite
- GitHub Check: tests 3.10 / ubuntu-latest sqlite
- GitHub Check: tests 3.12 / ubuntu-latest sqlite
- GitHub Check: tests 3.11 / ubuntu-latest sqlite
- GitHub Check: tests 3.12 / ubuntu-latest postgres
- GitHub Check: tests 3.11 / ubuntu-latest postgres
- GitHub Check: tests 3.11 / ubuntu-latest mysql
- GitHub Check: mypy 3.12 / ubuntu-latest
- GitHub Check: safety 3.12 / ubuntu-latest
- GitHub Check: check_docker_start_script
- GitHub Check: build
🔇 Additional comments (3)
spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py (3)
Lines 42-46: Well-structured event format with improved traceability!

The new event format follows standard event streaming patterns and improves traceability with:
- Flatter structure for easier consumption
- UUID for event correlation
- Configurable source
- Standard timestamp field
Lines 300-304: Verify all callers of log_event are updated.

The simplified log_event method signature is a good improvement, but we need to ensure all existing callers are updated to use the new format.
Let's verify the usage:
✅ Verification successful
All log_event calls are correctly using the new format
All calls to LoggingService.log_event consistently use the new signature with an event type string and a properly constructed log_extras dictionary. No instances of outdated usage were found.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for all calls to log_event to ensure they've been updated
echo "Searching for LoggingService.log_event usage..."
rg "LoggingService\.log_event" -B 2 -A 2

# Also check for potential missed imports
echo "Checking for logging service imports..."
rg "from.*logging_service.*import" -B 1 -A 1
```

Length of output: 6437

Script:

```shell
#!/bin/bash
# Get more context around the log_event calls to see the parameters
echo "Checking log_event call parameters..."
rg "LoggingService\.log_event\(" -B 5 -A 1
```

Length of output: 5030
Lines 80-95: Verify event type property mappings.

The specialized handling for different event types (task, workflow, event) looks comprehensive. However, we should verify that all necessary properties are being captured for each type.
Let's verify the property coverage:
✅ Verification successful
Property mappings for event types are correctly implemented
The specialized handling for different event types (task, workflow, event) in the logging service is comprehensive and properly implemented. The test coverage confirms that:
- All defined properties are actively used in workflow execution
- Properties are correctly captured for different event types
- The logging system successfully tracks both detailed and simple logging modes
The property mappings align with the actual usage patterns in the codebase and provide complete coverage of necessary workflow state information.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for usage of these event types to ensure we're capturing all relevant properties
echo "Searching for spiff.task usage..."
rg "spiff\.task" -B 2 -A 2
echo "Searching for spiff.workflow usage..."
rg "spiff\.workflow" -B 2 -A 2
echo "Searching for spiff.event usage..."
rg "spiff\.event" -B 2 -A 2
```

Length of output: 2629
Script:

```shell
#!/bin/bash
# Search for logger calls and property usage
echo "Searching for logger calls..."
rg "\.info\(|\.debug\(|\.warning\(|\.error\(" -B 3 -A 1 "src/spiffworkflow_backend"
echo "Searching for test files with spiff logging..."
fd "test.*\.py" | xargs rg "spiff\.(task|workflow|event)"
echo "Searching for property usage..."
rg "workflow_spec|task_spec|task_id|task_type|state|last_state_change|elapsed|parent|completed|success|bpmn_name|milestone|metadata|error_info" -B 2 -A 2 "src/spiffworkflow_backend"
```

Length of output: 702
Script:

```shell
#!/bin/bash
# First, let's find the correct paths
echo "Finding repository structure..."
fd -t d spiffworkflow_backend
echo "Finding all Python files..."
fd -e py
echo "Now search with correct paths..."
rg "\.info\(|\.debug\(|\.warning\(|\.error\(" -B 3 -A 1
echo "Searching for test files with spiff logging..."
fd -e py | xargs rg "spiff\.(task|workflow|event)" -B 2 -A 2
echo "Searching for property usage..."
rg "workflow_spec|task_spec|task_id|task_type|state|last_state_change|elapsed|parent|completed|success|bpmn_name|milestone|metadata|error_info" -B 2 -A 2
```

Length of output: 66422
Script:

```shell
#!/bin/bash
# Search for SpiffWorkflow related code and property assignments
echo "Searching for SpiffWorkflow task/workflow related code..."
rg "SpiffTask|SpiffWorkflow" -B 2 -A 2
echo "Searching for property assignments in logging context..."
rg "collect_log_extras|log_extras" -B 2 -A 2
echo "Searching for test files with logging assertions..."
fd "test_.*\.py" | xargs rg "spiff\.(task|workflow|event).*(workflow_spec|task_spec|task_id|task_type|state|last_state_change|elapsed|parent|completed|success|bpmn_name|milestone|metadata|error_info)"
```

Length of output: 69774
Script:

```shell
#!/bin/bash
# Search specifically in logging_service.py and related test files
echo "Searching logging service implementation..."
rg -B 5 -A 5 "def filter" spiffworkflow-backend/src/spiffworkflow_backend/services/logging_service.py
echo "Searching for test files related to logging..."
rg -l "test.*logging" spiffworkflow-backend/tests/spiffworkflow_backend/
echo "Searching for property assignments in logging tests..."
fd "test.*logging.*\.py" | xargs rg "assert.*spiff\.(task|workflow|event)"
```

Length of output: 1168
Script:

```shell
#!/bin/bash
# Look at the logging service test file content
echo "Examining logging service test file..."
cat spiffworkflow-backend/tests/spiffworkflow_backend/integration/test_logging_service.py
echo "Searching for actual log messages in tests..."
rg "spiff\.(task|workflow|event)" spiffworkflow-backend/tests/spiffworkflow_backend/integration/test_logging_service.py -B 2 -A 2
```

Length of output: 5787
Log all events generated by the backend to the event stream with as much process-specific info as possible, and simplify the event format.
Summary by CodeRabbit

- New Features
- Bug Fixes
- Refactor
- Chores