WICSA 2011 WS1a Evaluation
From WICSA Conference Wiki
- "Quantifying the Analyzability of Software Architectures", by Eric Bouwers, Correia, Arie van Deursen, Visser
- "SOFAS: A Lightweight Architecture for Software Analysis as a Service", by Giacomo Ghezzi, Gall
- "Analysis and Management of Architectural Dependencies in Iterative Release Planning", by Brown, Rod Nord, Ipek Ozkaya
- "Exploring Approaches of Integration Software Architecture Modeling with Quality Analysis Models", by Liliana Dobrica
- Please sign your name here if you are thinking of attending this session. (Click the signature button in the editor). Tell us something about your background. Add a few sentences about the working session topic such as your position, questions you would like to see discussed, etc.
- --Bob Schwanke, Siemens Corporate Research. Stakeholder concerns should be testable, early and often.
- --Liliana Dobrica
- --Eric Bouwers
- --Giacomo Ghezzi
Here is what was on the blackboards at the end of the session.
Different types of decomposition
- e.g. what developers say
- usually functional
- alternatively, using a language definition that has a formal definition of "component".
- What do you ask the expert when validating a metric against expert judgment?
- Watch for bias
- Don't tip your hand to the experts as to what kind of answer you are looking for
- Make sure the components that experts describe to you are matched to portions of the code, so that no code is missed or assigned to more than one component.
Preliminary system decomposition is used to partition a large team, even in agile projects.
"Lightweight web-service architecture"
- Open Repository vs. proprietary data
- Specifying inputs to desired analysis
- Handling incomplete data, especially issue tracking
- Incremental re-analysis
Ease of analysis
- to compute
- to use results
- power of results
- clean model or language, with powerful but hidden semantics.
What is the effect of the size of an organization on architecture vs. agility?
Incentives for modelling architecture and predicting quality?
- cost of failure
- cost of recall
- cost of rework
Optimizing decision of when to refactor
What architectural health qualities can be checked at every build?
The term "quality" needs to be used carefully:
- Architectural quality is any property that correlates with the value of the system or its architecture, positively or negatively. A quality can be vague.
- Quality measure is a measure (in the mathematical sense) that makes concrete one or more qualities.
- Quality requirement is a system requirement expressed in terms of one or more quality measures, usually with specific thresholds for adequacy, preference, and often for the point of diminishing returns.
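The distinction between a quality, a quality measure, and a quality requirement can be made concrete with a small sketch. The measure (average module fan-out), the dependency data, and both thresholds below are illustrative assumptions, not something discussed in the session:

```python
# Hypothetical example: a "quality requirement" attaches thresholds
# (adequacy and point of diminishing returns) to a concrete quality
# measure -- here, average module fan-out.

from statistics import mean

# Hypothetical dependency data: module -> modules it depends on.
DEPENDENCIES = {
    "ui": ["core", "util"],
    "core": ["util"],
    "util": [],
}

def average_fan_out(deps):
    """Quality measure: mean number of outgoing dependencies per module."""
    return mean(len(targets) for targets in deps.values())

# Quality requirement: thresholds chosen for illustration only.
ADEQUATE_MAX = 5.0   # above this, the requirement is violated
PREFERRED_MAX = 2.0  # at or below this, further reduction adds little value

measure = average_fan_out(DEPENDENCIES)
adequate = measure <= ADEQUATE_MAX
preferred = measure <= PREFERRED_MAX
print(measure, adequate, preferred)  # 1.0 True True
```

The vague quality here would be "maintainability"; the measure makes one facet of it concrete, and the thresholds turn the measure into a checkable requirement.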
It is difficult to find measures for some qualities.
It can also be difficult to manage the side effects of telling developers the measures: they will often distort their designs and implementations to improve the apparent quality as measured by the declared measures.
To obtain metrics on the structure of the system, that structure must first be recoverable from the implementation, or verifiable with respect to the architectural description.
We discussed one case study where the project leadership defined a set of executable checks for system consistency with the architecture. The first time they analyzed the code they found 82,000 violations. Now the developers run the analysis before committing their code, and no violations are detected after integration.
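The case study's actual checks were not described, but such an executable architecture-consistency check might, for example, verify that code dependencies respect a declared layering. A minimal sketch, in which the layer names, module names, and the layering rule are all assumptions for illustration:

```python
# Sketch of a build-time architecture conformance check: flag code
# dependencies that point from a lower layer to a higher one. All
# names and the rule itself are illustrative, not the case study's.

LAYER_ORDER = ["ui", "service", "data"]  # higher layers may use lower ones

# Dependencies as extracted from the code:
# (from_module, from_layer, to_module, to_layer)
DEPENDENCIES = [
    ("LoginView", "ui", "AuthService", "service"),
    ("AuthService", "service", "UserRepo", "data"),
    ("UserRepo", "data", "LoginView", "ui"),  # violation: data -> ui
]

def violations(deps, order):
    """Return dependencies that point from a lower layer to a higher one."""
    rank = {layer: i for i, layer in enumerate(order)}
    return [d for d in deps if rank[d[1]] > rank[d[3]]]

for frm, from_layer, to, to_layer in violations(DEPENDENCIES, LAYER_ORDER):
    print(f"violation: {frm} ({from_layer}) -> {to} ({to_layer})")
```

Run before every commit, a check of this shape gives the "no violations after integration" property the case study reports.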
What are the costs of measuring quality?
- Global repository
- Analysis tools