Software Life Cycle Discussion
Slide Deck
Download the slide deck from here.
Video Transcript
Hello, this is Amir Shehata with another quick tip on the LUTF.
Today I'd like to express my thoughts on the software process and propose a new way of working, which could prove advantageous.
Problems with the Traditional Software Development Model
We all learnt the waterfall model at school. I don't know about you, but in practice it is never that clean.
It's a lot more iterative.
As we write the HLD, we find we need to roll back and adjust the requirements. The detailed design is often passed over in favour of going directly to implementation, which I can't really disagree with.
The Test Plan is written at the very end.
Bug fixes happen during implementation and testing, and often result in design changes and even requirement changes.
We end up with a bit of a messy development process.
Under deadlines, what inevitably ends up happening is that the documentation goes stale.
...
The trick is to reduce the cost of the solution and maximize the benefit.
Proposed Solution
My proposed solution is to bring test plan development to the centre of the software development cycle.
A well-formed requirement must be testable. Therefore, we can represent each requirement as a test case.
Each requirement must be designed. The details of the design can also be added to the test case.
We write the HLD (as we would normally) and can add extra test cases during the design phase as needed. These test cases would test existing requirements or represent newly discovered requirements we hadn't thought of before.
We move to implementation (either after the HLD phase is completed or in parallel with it) and can still add extra test cases as needed. Again, these test cases can test existing requirements or represent new ones.
This can happen all the way down to the bug fixing phase.
Once the feature is complete we should have a set of test scripts which represent the requirements, the test plan and parts of the design.
We can conceivably then extract all the information needed for these documents.
The LUTF Involvement
Let's now imagine that all these test scripts, which include:
...
With this workflow, the requirements, the design, the test plan and the test plan implementation are collapsed into one form: the LUTF test scripts. From the test scripts we can generate our documents.
I believe this reduces the overhead and increases our chances of keeping documentation consistent with the implementation.
...
Each test script should include a documentation block. Not all elements shown here are needed, but the more complete the block is, the more complete the generated document will be.
"""
@PRIMARY: Primary Requirement ID
@PRIMARY_DESC: Primary Requirement ID description
@SECONDARY: Secondary Requirement ID if it exists
@SECONDARY_DESC: Secondary Requirement ID description if it exists
@DESIGN: Design notes and description
@TESTCASE: Test case description
"""
It's enclosed in triple quotes (""").
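For illustration, a filled-in block for a hypothetical test script might look like the following. The requirement IDs, descriptions and design notes here are invented for the example and are not taken from an actual LUTF suite.

"""
@PRIMARY: REQ-0010
@PRIMARY_DESC: It shall be possible to add a peer with multiple NIDs
@SECONDARY: REQ-0042
@SECONDARY_DESC: Peer configuration shall be exportable in YAML format
@DESIGN: Add the peer through the configuration interface, export the
         configuration and compare it against the expected YAML structure.
@TESTCASE: Add a multi-NID peer, export the configuration and verify that
           all NIDs appear under the peer entry.
"""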
The LUTF can automatically generate the bulk of the requirements, HLD and test plan documentation from the test scripts.
This method provides the glue between the code and the documentation. As bugs are fixed or the feature is updated, the developer should create new test cases to cover the changes and regenerate the documentation.
The LUTF can generate documentation with the create_docs() command.
suites['samples'].create_docs("samples.csv")
This will generate two CSV files: one is the requirements document with the design, and the other is the test plan.
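To make the extraction step concrete, here is a minimal sketch of how the documentation blocks could be collected into a requirements CSV. It assumes the test scripts sit in a single directory and carry the block shown earlier; it is not the LUTF's actual create_docs() implementation.

import csv
import re
from pathlib import Path

# Tags expected in each script's documentation block.
TAGS = ("PRIMARY", "PRIMARY_DESC", "SECONDARY", "SECONDARY_DESC",
        "DESIGN", "TESTCASE")

def extract_doc_block(script: Path) -> dict:
    """Pull the @TAG: value pairs out of a test script's docstring."""
    text = script.read_text()
    fields = {}
    for tag in TAGS:
        # Capture everything after '@TAG:' up to the next tag or the
        # closing triple quote, collapsing internal whitespace.
        match = re.search(rf'@{tag}:\s*(.*?)(?=@[A-Z_]+:|""")', text, re.S)
        fields[tag] = " ".join(match.group(1).split()) if match else ""
    return fields

def create_requirements_csv(script_dir: str, out_file: str) -> None:
    """Write one CSV row per test script: requirement, design and test case."""
    with open(out_file, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["Requirement ID", "Description", "Design", "Test Case"])
        for script in sorted(Path(script_dir).glob("*.py")):
            doc = extract_doc_block(script)
            writer.writerow([doc["PRIMARY"], doc["PRIMARY_DESC"],
                             doc["DESIGN"], doc["TESTCASE"]])

# Hypothetical usage:
# create_requirements_csv("suites/samples", "samples.csv")

A second pass over the same blocks could emit the test plan CSV in the same way.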
Requirement Document
...
HLD
...
Test Plan
...
NOTE: each secondary requirement in a test case should be a primary requirement in another test case. The idea here is that some test cases might touch on more than one requirement.
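As a concrete (again invented) example, a script that lists REQ-0042 as a secondary requirement relies on some other script owning REQ-0042 as its primary requirement:

# First hypothetical script: owns REQ-0010 and also touches REQ-0042
"""
@PRIMARY: REQ-0010
@SECONDARY: REQ-0042
...
"""

# Second hypothetical script: owns REQ-0042 as its primary requirement
"""
@PRIMARY: REQ-0042
...
"""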
The CSV files can then be imported directly into the Confluence wiki or embedded in a document.
Once the test plan is reviewed, updated and approved, the script logic can be written in a separate phase.
Caveat
Of course, life is always more complicated than it appears on paper.
Diagrams and other media might be needed to explain the requirements and the design, which cannot be included in a text-only test script.
However, my argument is that the above process can alleviate much of the required document maintenance.
Let's take the Requirements document as an example.
The first section can have the overview and diagrammatic explanations required. The second section is the table detailing all the requirements.
Let's look at how all this would look in the LUTF.
Demonstration
Updates to the second section of the document can be automatically generated from the test scripts.
As bugs are fixed or the feature is updated, as long as the developer creates new test cases to cover the changes and regenerates the documentation, the code and the documentation will remain in sync. Let's run through this process to show how easy it is.