diff --git a/docs/guides/software-lifecycle/continuous-testing/README.md b/docs/guides/software-lifecycle/continuous-testing/README.md
index fd37a0c9a..5956dfe74 100644
--- a/docs/guides/software-lifecycle/continuous-testing/README.md
+++ b/docs/guides/software-lifecycle/continuous-testing/README.md
@@ -239,7 +239,7 @@ The following are suggested LLM prompts to use with automated generation.
     unittest.main()
 ```
 
-#### Use Case 2: Writing System Tests using Downloadable, Open Source Code Generation Models and Behavior-Driven Development Frameworks
+#### 2.2 Example: Writing System Tests Using Downloadable, Open-Source Code Generation Models and Behavior-Driven Development Frameworks
 
 System-level behavior that involves interaction with external software or files can be complicated to test. Please refer to the [System Testing section of our Testing Frameworks Guide](testing-frameworks#for-system-testing) for recommended system testing tools that leverage behavior-driven development (BDD) philosophies. One such tool, [**Robot Framework**](https://github.com/robotframework/QuickStartGuide/blob/master/QuickStart.rst), is demoed below in combination with Large Language Models (LLMs). Using an LLM to auto-generate Robot Framework pseudocode can streamline the creation of integration test cases, allowing complicated system testing scenarios to be simplified and written automatically. Here's an example: