[Checkers] Question about resolving type variables

Michael Ernst mernst at csail.mit.edu
Mon Mar 24 16:44:18 EDT 2008


> Currently, type variables are inferred from the actual arguments  
> passed if possible.  Unfortunately, this may lead to unexpected  
> results, namely violating backward compatibility.  Consider:
> 	List<String> lst = Arrays.asList("1", "m", "n");
> Currently, this will issue a type compatibility error, as the return  
> type of 'Arrays.asList(...)' is List<@Interned String> which is not a  
> subtype of List<String>.

This is a very interesting question!


  Incidentally, whenever there is an interesting design/implementation
  question like this, let's record it somewhere.  If we have to think about
  something, then it is likely to be useful and interesting to others.  If we
  don't describe it in a paper or other document, then other people will have
  to re-discover it for themselves (and possibly claim credit for being the
  first to notice it).  Also, recall that a major reason for the rejection of
  our first paper was that we failed to discuss exactly this sort of issue --
  things that were novel and were lessons learned.  I don't want us to make
  this mistake again.

  I've added this item to file
  /typequals/annotations/papers/checkers-framework/to-do.txt and I would like
  all of you to think about other topics that are worth discussing and to also
  add them there.  They may not all go in that particular paper (though they
  will if we can fit them!); if not, perhaps we'll write a paper with more
  implementation details, or a longer version of that paper.

End of aside.

Now, to address your question:

Most expressions have multiple legal types.  "m" has the types @Interned
String, String, @Interned Object, and Object.  Ordinarily it's best to choose the
most specific one, since that will be usable in the most contexts.
However, when the type is used as a generic argument, extra specificity is
no advantage.  The error that the checker gives for

        List<String> lst = Arrays.asList("1", "m", "n");

is essentially the same as the one that ordinary Java gives for

        List<Object> lst_o = Arrays.asList("1", "m", "n");

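To make the analogy concrete, here is a minimal plain-Java sketch (no
qualifiers involved) of why the most specific inferred type can be rejected:
generic types are invariant, so extra specificity in the type argument is no
help at an assignment, though a wildcard recovers the flexibility.

```java
import java.util.Arrays;
import java.util.List;

public class InvarianceDemo {
    public static void main(String[] args) {
        // javac infers T = String, the most specific choice:
        List<String> strings = Arrays.asList("1", "m", "n");

        // Generics are invariant: List<String> is not a subtype of
        // List<Object>, even though String is a subtype of Object.
        // This line would be a compile-time error:
        //   List<Object> objects = strings;

        // A wildcard accepts any instantiation of List:
        List<? extends Object> any = strings;
        System.out.println(any.size());  // prints 3
    }
}
```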
Here are some possible approaches:

1. Supply the context (the expected type) to inference, and if that type is
possible to achieve, then use it instead of the most specific type.  (This
would make the "List<Object>" Java example type-check, too!)

 * must supply the type context.  This probably isn't a big deal.
 * sometimes, the type context is not known or is not unique (e.g., when
   there is an overloaded method).  In this case, the inference should fall
   back to the normal behavior.
 * type-checking becomes less modular.  A change in a far-removed part of
   the program may cause this code to type-check differently, for example
   to start or stop issuing a type error.

2. Make the programmer supply an explicit type to override the default
behavior.  I'm not sure whether the default behavior in this case should be
"List<String>" (for backward compatibility) or "List<@Interned String>"
"List<@Interned String>" (for precision).  I lean slightly toward the latter,
but can't justify my preference.

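In plain Java, the analogous escape hatch already exists: the caller can pin
the type argument with an explicit type witness instead of relying on
inference.  A minimal sketch:

```java
import java.util.Arrays;
import java.util.List;

public class ExplicitWitnessDemo {
    public static void main(String[] args) {
        // The explicit type witness <Object> overrides inference,
        // so the call's type is List<Object>, not List<String>:
        List<Object> lst_o = Arrays.<Object>asList("1", "m", "n");
        System.out.println(lst_o);  // prints [1, m, n]
    }
}
```

The checker analogue would presumably be an explicit
Arrays.<String>asList(...) to request List<String> rather than
List<@Interned String>.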
3. Other ideas?

I like the idea of trying #1, but I welcome others' input.

