Create logging files to automate testing scenarios #190

Open
darbyje opened this issue Mar 26, 2024 · 0 comments
darbyje commented Mar 26, 2024

It currently appears impossible to store the results of a test once it has been logged. At production scale, with a significant number of test cases, the ability to develop a test script covering all the scenarios and to store the outcome of each scenario in a results file such as a CSV would let users home in on the failed tests.

Since a normal test cycle would run multiple rounds (defects resolved, then another testing run performed), the results file should be named based on the run time of the script, so each round produces a distinct file.

Further to this, being able to automate the runs with a cron job and then review the output files would allow an administrator to monitor the effect of NLU changes (addition or deletion of utterances) and perform regression testing, ensuring the intent model does not move backwards as more phrases are added to the bot during the normal tuning process.
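To make the idea concrete, here is a minimal sketch in Python of what the results file could look like. The field names (scenario, utterance, expected vs. actual intent) and the `write_results` helper are hypothetical, not part of any existing API; the point is just the timestamped CSV per run:

```python
import csv
from datetime import datetime
from pathlib import Path

def write_results(scenarios, results_dir="results"):
    """Write one row per test scenario to a CSV named after the run time.

    `scenarios` is an iterable of (name, utterance, expected_intent,
    actual_intent) tuples; these fields are illustrative only.
    """
    Path(results_dir).mkdir(exist_ok=True)
    # Timestamp in the filename so each test round gets its own file.
    run_time = datetime.now().strftime("%Y%m%d-%H%M%S")
    out_path = Path(results_dir) / f"test-results-{run_time}.csv"
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["scenario", "utterance", "expected_intent",
                         "actual_intent", "passed"])
        for name, utterance, expected, actual in scenarios:
            writer.writerow([name, utterance, expected, actual,
                             expected == actual])
    return out_path

# Example run with two scenarios, one passing and one failing.
path = write_results([
    ("greet", "hello there", "Greeting", "Greeting"),
    ("order", "I want a pizza", "OrderIntent", "FallbackIntent"),
])
```

A cron entry pointing at the script would then accumulate one such file per scheduled run, and filtering on the `passed` column surfaces the regressions.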
