[Dune] ParameterTree

Carsten Gräser graeser at math.fu-berlin.de
Fri Jan 28 15:02:10 CET 2011


Am 28.01.2011 13:13, schrieb Jö Fahlke:
> Am Fri, 28. Jan 2011, 02:10:40 +0100 schrieb Carsten Gräser:
>>> I always understood the implicit determination of the conversion type as the
>>> official way to select the particular overload of the get()-method with
>>> default parameter, and I'm using it extensively.  The simple fact that these
>> Indeed this was what I originally intended. However it does not do what you
>> naturally expect: Select the type of the return value.
> 
> It doesn't!?  As far as I can tell it does, along with the type for the
> conversion.  And yes, "char*" as the default parameter selects "std::string"
> for the return type and the conversion type, but nobody wants "char*" as
> return type anyway, and this allows us to use string literals as default
> values.  I probably don't get what you are trying to tell me here.
> 
>>                                                        Instead it selects
>> the type of the expression, which may be different.
> 
> Now you have completely lost me.  Which expression do you mean here?
> 
If you use

  T t = pTree.get("key", expression);

what you really want is almost always the string->T conversion.
Instead, the string->typeof(expression) conversion is used.
For example the user might expect

  double x = pTree.get("key", 0);

to parse the value as a double, which is not done. In this case it is
safe, because you get an exception. In other cases it need not be safe.

>>> methods were sometimes implemented as overloads made this the only reliable
>>> way.  I consider the fact that most of these overloads are provided by a
>>> method template an implementation detail.
>> In my opinion it provides a natural way to select a type: using a template parameter.
>>
>>> Instead I propose to specialize the ParameterTree::Parser struct for char to read
>>> a numerical value instead of a character.
>> You might also want the character depending on your application.
>> The main reason to forbid the implicit type determination is that
>> similar unexpected behaviour might appear if the user defines the
>> conversion for custom types.
> 
> If you want the character, you can always read a std::string, check that its
> size is exactly 1 and take its first character.  If we keep it as it is, and you
> want the numerical value, you can convert the default value to an int and
> convert the return value back to a char, while checking that the return value
> actually fits in a char.  Either way you make it a little harder for some
> people, and I don't care much which way we choose.  I believe that it is a
> little more likely that people want the numerical value for a char (I've run
> into the same problem before...)
> 
> As to custom conversions, do you mean IO-conversions (operator<<) or type
> conversions here?  In both cases I don't see a problem that could be solved by
> specifying the conversion type explicitly.

Currently you can simply specialize Parser<A> and Parser<B> such that
the constructor A(B) exists and is commonly used. If you don't have

  Parser<A>::parse(string)  == (A)(Parser<B>::parse(string))

then the following two lines will not do the same thing:

  B b;
  A a1 = pTree.get("key", b);
  A a2 = pTree.get<A>("key", b);

while the latter is what you want in most cases. One example would be the following:
Suppose you have specialized Parser<R> for some type R of exact rational
numbers that can, e.g., be constructed from a double, and that the stored
value for "key" is the string "0.1". Then

  R r = pTree.get<R>("key", 0.0);

will result in r==1/10 while

  R r = pTree.get("key", 0.0);

will result in r != 1/10, because the string is first parsed as a double
and r is then constructed from that inexact value. Further examples:

  std::vector<int> v;

  long long ll = pTree.get("key", v.size());
  std::cout << ll << std::endl;

  std::cout << pTree.get("key", v.size()) << std::endl;

  int i = pTree.get("key", v.size());
  std::cout << i << std::endl;

Guess what happens for key=-1: because v.size() has the unsigned type
std::size_t, only the last one prints -1. Funny, isn't it?


> The problem here is really much more fundamental than just deciding on the
> right way to parse a char; the interpretation of the parameter value string
> really depends on more than just the type of the variable the result is gonna
> be assigned to.  There are two solutions to this: either we provide some
> random interpretation based on what is more convenient most of the time; if
> that does not suit you, you have to extract the string value and do the
> interpretation yourself.  Or we invent a full-blown and complex system of
> parsers, making it possible to specify the desired interpretation based on the
> type the application uses to store the value, some application-specific
> interpretation (do you interpret vector<pair<int,int> > as a list of ranges
> or as a list of mappings from one integer to another?), some site-specific
> configuration format style (does a vector<double> have to look like "0.5 4
> 7.3" or "0.5, 4, 7.3"), and potentially many more things.  We currently are
> much closer to the former solution, and I suggest to keep it that way.

Best,
Carsten



