This process applies best practices from software development, informed by existing experience.
It assumes an agile development approach but can be adapted to non-agile approaches as well.
The agents:
SystemArchitect - responsible for designing the features (PoW: SDD-Draft)
The Reviewer - reviews the SDD-Draft, records findings, and sets the "review done" flag. (PoW: SDD-v1)
The Testers: implement test cases based on SDD-v1 (PoW: implemented test cases committed to GIT)
The Developers: implement features based on SDD-v1 (PoW: implemented features committed to GIT)
The CM/QM: sets up a CI service - utilizing the GIT repository used by Dev & Test - that provides test results. Based on predefined quality criteria, features are marked as done.
The customer orders a set of features by mail.
(Task 1 - Elias) Those are designed by the artificial designer and documented in a draft design document.
(Task 2 - Aida) The tester/student assumes the role of the reviewer. As the reviewer, the task at hand is to evaluate the design documents for the features to be implemented in this sprint. The reviewer receives an email notification that a design document is ready for review, then logs in and sees the document with the option of marking faulty lines. If any lines are marked, the document is sent back into the design process. After the redesign and/or rejection of remarks, the document is sent to the reviewer again for inspection. Should the reviewer deem the document "good enough", they can conclude the review and mark it done.
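The review loop described above can be sketched as a simple iteration. The function names, the data model, and the redesign placeholder below are illustrative assumptions, not part of any actual tooling:

```python
# Minimal sketch of the design/review loop, assuming a document is a
# list of lines and the reviewer marks faulty lines by index.

def review_cycle(document, find_faulty_lines):
    """Iterate redesign/review until the reviewer marks no faulty lines.

    find_faulty_lines: callable returning the set of line indices the
    reviewer marks in the current document version.
    Returns the accepted document and the number of review iterations.
    """
    iterations = 0
    while True:
        iterations += 1
        marked = find_faulty_lines(document)
        if not marked:
            return document, iterations  # review concluded, marked done
        document = redesign(document, marked)  # back into the design process

def redesign(document, marked_lines):
    # Placeholder for the (automated) redesign step: pretend the
    # marked lines were replaced with fixed versions.
    kept = [line for i, line in enumerate(document) if i not in marked_lines]
    return kept + [f"fixed line {i}" for i in marked_lines]
```

The iteration count returned here is exactly the "review iterations per feature" figure tracked as a KPI later in the process.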
(Task 3 - Martin) Depending on the work done by the reviewer (course students), the design paper provided to the artificial implementer (automated task) will have a varying quality level.
(Task 4 - Martin) The same applies to the artificial tester (automated task), who will also base the test case implementation on the same design document.
(Task 5 - Oliver) The last task is handled by the artificial quality manager, who is in charge of building the implementation and test cases and performing a full test run of the ordered features. Depending on a predefined quality level, the features will be marked either as Passed or Failed (there might be a third state, such as "passed with prejudice", depending on necessity).
The report of all completed & passed features will be provided by mail to a "customer".
Features cannot be marked done if software quality is subpar (to be defined).
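As a rough illustration of such a quality gate, the sketch below assumes a 90% pass-rate threshold and boolean per-test results; both the threshold and the data structures are assumptions, since the quality criteria are still to be defined:

```python
# Hypothetical CM/QM quality gate: mark a feature Passed or Failed
# from its CI test results. The 0.9 threshold stands in for the
# "predefined quality level" that is not yet specified.

PASS_RATE_THRESHOLD = 0.9  # assumed value, to be defined

def gate_feature(test_results):
    """test_results: list of booleans, one per executed test case."""
    if not test_results:
        return "Failed"  # no tests executed -> cannot be marked done
    pass_rate = sum(test_results) / len(test_results)
    return "Passed" if pass_rate >= PASS_RATE_THRESHOLD else "Failed"

# Example CI run for two ordered features (names are illustrative)
results = {
    "feature-login": [True, True, True, True, True],
    "feature-report": [True, False, True],
}
verdicts = {name: gate_feature(r) for name, r in results.items()}
```

A third state like "passed with prejudice" could be added as an intermediate band between two thresholds.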
KPIs:
1. Time (whole process & review iterations)
2. Ordered requirements vs. fulfilled requirements
3. Number of review iterations/feature
4. Number of process completions vs. failures
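KPIs 2-4 could be computed from per-feature records roughly as follows; the record fields and sample values are illustrative assumptions, and KPI 1 (time) would additionally require timestamps from the process tooling:

```python
# Hypothetical per-feature records collected over one sprint.
features = [
    {"name": "A", "fulfilled": True,  "review_iterations": 2, "completed": True},
    {"name": "B", "fulfilled": False, "review_iterations": 4, "completed": False},
    {"name": "C", "fulfilled": True,  "review_iterations": 1, "completed": True},
]

ordered = len(features)
# KPI 2: ordered vs. fulfilled requirements
fulfilled = sum(f["fulfilled"] for f in features)
# KPI 3: review iterations per feature (here as an average)
avg_iterations = sum(f["review_iterations"] for f in features) / ordered
# KPI 4: process completions vs. failures
completions = sum(f["completed"] for f in features)
fails = ordered - completions
```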
We assumed we would need 5 tasks, but it seems that Martin Kurzmann will not be participating in our group: we have so far been unable to contact him, he has not responded to any of our mails, and he has not taken any of the mandatory quizzes.
Should he not join, we may adapt the process slightly to match our team.