
List:       gcc-patches
Subject:    Re: Patch: New implementation of -Wstrict-aliasing
From:       Silvius Rus <rus@google.com>
Date:       2007-01-31 22:00:26
Message-ID: 45C1117A.1020807@google.com

Gabriel Dos Reis wrote:
> Silvius Rus <rus@google.com> writes:
>
> |[...]
> | Can you please advise on the following choice of warning levels.
> | Please let me know which of the following levels you would like to see
> | in 4.3, in which order.  In all cases, the warnings are conditioned by
> | -fstrict-aliasing.
> | Thank you,
> | Silvius
>
> Hi Silvius,
>
>   Thanks for taking the time to expand on this.
>
> All of the levels seem good to me.  However, I would appreciate an
> explanation of the choice of levels.
>
> First I thought that the accuracy of the diagnostics improves as
> the level increases, meaning fewer false positives and fewer false
> negatives.
In my first proposal (last week), that was the case.  However, I thought 
it over and realized it exposed too much detail to the user (extensive 
use of --param) with insufficient practical impact.  After I redesigned 
it, the interface looked very similar to the previous -Wstrict-aliasing 
implementation.
> However, I'm under the impression that under your
> proposal, the accuracy decreases as the level increases.  Is that
> reading correct?
>   
That is correct.
> My general thinking was that the compile time consumed by warning
> about non-conforming aliasing would increase with the level.  So, for
> example level=1 will be quick but not very precise, and level=4 might
> be slow but more accurate.  Just like with optimization levels.
> What motivates your choice in the other direction?
>   
I made this choice for consistency with the existing -Wstrict-aliasing.  
In the previous implementation, level 2 was usually less accurate than 
level 1, in the sense that it produced more false positives.  However, I 
agree that ordering them by increasing accuracy (and effort) makes good 
sense by analogy to optimization levels.  I have no problem reversing 
the proposed order, as long as the most accurate one is the default.  
Let me know if you are OK with this revised order:
level 1: least precise due to many false positives
level 2: more precise, still some false positives
level 3 (default): most precise, very few false positives and few false 
negatives
level 4: an improvement over level 3, but it cannot be the default 
because it breaks bootstrap
Also, ordering them by increasing complexity allows us to add future 
detection methods at higher level numbers.  I can imagine a future, more 
precise implementation based on virtual SSA rather than points-to 
information.
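
For reference, here is a minimal example (my own sketch, not taken from
the patch) of the kind of access these warnings are meant to flag; the
exact diagnostic text and the level at which it is reported are not
fixed by this proposal:

    /* Reading a float object through an int lvalue breaks the C
       aliasing rules that -fstrict-aliasing relies on.  */
    int bits_of_float (float f)
    {
      float x = f;
      int *p = (int *) &x;    /* cast to an incompatible pointer type */
      return *p;              /* access that -Wstrict-aliasing should flag */
    }

Compiled with -O2 -fstrict-aliasing -Wstrict-aliasing, this is the sort
of construct any of the proposed levels would be expected to report.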

Speaking of levels: unfortunately, we cannot come up with a perfect 
solution, because "accuracy" does not define a total order on the set of 
analysis methods we have available.  First, we measure accuracy as a 
pair, <false positives, false negatives>.  In the previous 
implementation, level 1 had fewer false positives than level 2, but 
more false negatives.  Second, even if we look only at false negatives, 
they are measured as sets, which are not always in an inclusion relation 
with each other (from one -Wstrict-aliasing level to another).
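
To make the trade-off concrete, here is a contrived sketch (both
functions are hypothetical and only illustrate the two failure modes,
not the behavior of any particular level):

    /* Case A: the cast alone does not violate the aliasing rules,
       because the result is never dereferenced; a purely syntactic
       check that warns on every incompatible cast would report a
       false positive here.  */
    int *just_a_cast (float *fp)
    {
      return (int *) fp;      /* no access through the wrong type */
    }

    /* Case B: the incompatible access is laundered through a void
       pointer; a shallow check that only looks at explicit casts may
       miss it (a false negative), while a points-to based analysis
       can still catch it.  */
    int hidden_punning (float *fp)
    {
      void *tmp = fp;
      int *ip = tmp;
      return *ip;             /* actual aliasing violation */
    }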

The order I am proposing now is clearly biased towards a low count of 
false positives.  I can see that for certain users better accuracy could 
mean a low count of false negatives, and in that case my proposed order 
might seem confusing, but there is only so much you can pack into a 
small integer.


Thank you!
Silvius

