Software Life cycle discussion

Video: https://www.youtube.com/watch?v=S1nrPaQaViM

Slide Deck

Download the slide deck from here.

Video Transcript

Hello, this is Amir Shehata with another quick tip on the LUTF.

Today I'd like to express my thoughts on the software process and propose a new way of working, which could prove advantageous.

Problems with the Traditional Software Development Model

We all learnt the waterfall model at school. I don't know about you, but in practice it is never that clean.

It's a lot more iterative.

As we write the HLD, we find we need to roll back and adjust the requirements. The detailed design is often passed over in favour of going directly to implementation, which I can't really disagree with.

A compromise is to develop the HLD in enough detail that we can jump directly into implementation.


The Test Plan is written at the very end.

Bug fixes and testing often result in design changes and even requirement changes.

We end up with a bit of a messy development process. In my experience, what often happens is that the test plan is left to the very end; when we get to it, we sometimes discover a missing requirement or a missing element of the design.

Under deadlines, what inevitably ends up happening is that the documentation goes stale.

...

I contend that there is no zero-effort solution. Any solution will require adjusting the development workflow, which will appear to be extra effort.

The trick is to reduce the cost of the solution and maximize the benefit.

Before I get to my proposal, a disclaimer. I'm a strong proponent of formalized documentation, but I'm flexible about the form this formalized documentation takes.

One of the big problems with maintaining a test plan is that developers need to create a test plan document which is separate from the test scripts.

What inevitably ends up happening is that as the product evolves and bugs get fixed, more test cases are added, but the test plan is not updated.

Even more problematic is that the design diverges and the HLD and/or requirements are not updated.

The LUTF provides a way to resolve this issue.

Each test script should include a documentation block formatted as follows:

Code Block
"""
@PRIMARY: sample_01
@SECONDARY: sample_01
@DESCRIPTION: Simple Hello Lustre test
"""

The block is enclosed in triple quotes (""").

  • @PRIMARY: is the primary requirement this script fulfills
  • @SECONDARY: is the secondary requirement, if any, this script fulfills
  • @DESCRIPTION: is a detailed description of the test case

By specifying these, the LUTF can generate automatic documentation for the test scripts. We no longer need to maintain a separate test plan. Our test scripts become our test plan.

This method provides the glue between the code and the documentation. As bugs are fixed or the feature is updated, the developer should create a test case to cover the changes made.

Each test case should have a documentation block.

The LUTF then provides a method to extract the documentation block and create a table which can then be imported to the wiki. In this way it becomes much easier for the developer to maintain appropriate documentation without much overhead.
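Conceptually, the extraction is straightforward. The sketch below is not the LUTF's actual implementation, just a minimal illustration of the idea: read a script's leading triple-quoted block and pull out the tagged fields.

Code Block
import re

TAGS = ("@PRIMARY", "@SECONDARY", "@DESCRIPTION")

def extract_doc_block(script_path):
    """Return the tagged fields from a script's leading documentation block."""
    with open(script_path) as f:
        source = f.read()
    # The documentation block is the first triple-quoted string in the file.
    match = re.search(r'"""(.*?)"""', source, re.DOTALL)
    if not match:
        return {}
    info = {}
    for line in match.group(1).splitlines():
        for tag in TAGS:
            if line.strip().startswith(tag + ":"):
                info[tag.lstrip("@")] = line.split(":", 1)[1].strip()
    return info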

One way of working would be to create a skeleton of the scripts which just include the above comment block, then run the create_docs() command on the suite:

Code Block
suites['samples'].create_docs("samples.csv")

This will generate a CSV file with all the documentation.
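If you want to sanity-check the output before importing it, a few lines of Python are enough; the sketch below makes no assumption about the column layout create_docs() emits, it simply prints whatever is in the file.

Code Block
import csv

# Print the generated documentation table before importing it to the wiki.
with open("samples.csv", newline="") as f:
    for row in csv.reader(f):
        print(row)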

This can then be imported directly into the Confluence wiki.

Once the test plan is reviewed, updated and approved, the script logic can be written in a separate phase.
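As an illustration, a skeleton script at this stage might contain nothing but the documentation block and a stubbed entry point. The run() function name below is an assumption for the sketch; only the documentation block format comes from the example above.

Code Block
"""
@PRIMARY: sample_01
@SECONDARY: sample_01
@DESCRIPTION: Simple Hello Lustre test
"""

def run():
    # Test logic is added in a later phase, once the generated
    # test plan has been reviewed and approved.
    pass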

Proposed Solution

My proposed solution is to bring the test plan development to the centre of the Software Development Cycle.

A well-formed requirement must be testable. Therefore we can represent each requirement as a test case.

Each requirement must be designed. The details of the design can also be added to the test case.
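One lightweight way to do this, using only the tags already shown, is to carry a short design note inside the test case's @DESCRIPTION. The wording below is purely illustrative.

Code Block
"""
@PRIMARY: sample_01
@SECONDARY: sample_01
@DESCRIPTION: Simple Hello Lustre test.
    Design note: outline here how the requirement will be implemented,
    e.g. which module is touched and what behaviour is expected.
"""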

We can add extra test cases during the design phase as needed. These test cases could represent new requirements.

When we move to implementation (either after the design phase is completed or in parallel) we can still add extra test cases. Again these test cases can represent new requirements.

This can happen all the way down to the bug fixing phase.

Once the feature is complete we should have a set of test scripts which represent the requirements, test plan and parts of the design.

We can conceivably then extract all the information needed for these documents.

The LUTF Involvement

Let's now imagine that all these test scripts, which include:

  • the requirements
  • the details on how the requirements will be designed, and
  • the test case description

are LUTF test scripts.

We can use the LUTF to automatically extract all this information from the test scripts and generate our documentation.

As you can see, we no longer need to keep three different documents: a requirements document, an HLD document and a test plan document.

With this workflow, the requirements, design and test plan are collapsed into one form: the LUTF test scripts. From the test scripts we can generate our documents.

I believe this reduces the overhead and increases our chances of keeping documentation consistent with the implementation.

Caveat

Of course, life is always more complicated than it appears on paper.

Diagrams and other media might be needed to explain the requirements and the design, which cannot be included in a text-only form.

However, my argument is that the above process can alleviate much of the required document maintenance.

Let's take the Requirements document as an example.

The first section can have the overview and diagrammatic explanations required. The second section is the table detailing all the requirements.

Updates to the second section of the document can be automatically generated from the test scripts.
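The generated CSV can be turned into a wiki table with very little code. The sketch below assumes Confluence wiki markup (|| for header cells, | for body cells) and simply reuses whatever column layout create_docs() produced.

Code Block
import csv

def csv_to_wiki_table(csv_path):
    """Convert the generated CSV into Confluence wiki-markup table rows."""
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    if not rows:
        return ""
    lines = ["||" + "||".join(rows[0]) + "||"]   # header row
    lines += ["|" + "|".join(row) + "|" for row in rows[1:]]
    return "\n".join(lines)

print(csv_to_wiki_table("samples.csv"))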

This method provides the glue between the code and the documentation. As bugs are fixed or the feature is updated, as long as the developer creates new test cases to cover the changes made and regenerates the documentation, the code and the documentation will remain in sync. Let's run through this process to show how easy it is.