[Dune] ParameterTree

Jö Fahlke jorrit at jorrit.de
Fri Jan 28 13:13:08 CET 2011


On Fri, 28 Jan 2011 at 02:10:40 +0100, Carsten Gräser wrote:
> > I always understood the implicit determination of the conversion type as the
> > official way to select the particular overload of the get()-method with
> > default parameter, and I'm using it extensively.  The simple fact that these
> Indeed this was what I originally intended. However it does not do what you
> naturally expect: Select the type of the return value.

It doesn't!?  As far as I can tell it does, along with the type for the
conversion.  And yes, "char*" as the default parameter selects "std::string"
for the return type and the conversion type, but nobody wants "char*" as
return type anyway, and this allows us to use string literals as default
values.  I probably don't get what you are trying to tell me here.

>                                                        Instead it selects
> the type of the expression that might be different.

Now you have completely lost me.  Which expression do you mean here?

> > methods were sometimes implemented as overloads made this the only reliable
> > way.  I consider the fact that most of these overloads are provided by a
> > method template an implementation detail.
> In my opinion it provides a natural way to select a type: using a template parameter.
>
> > Instead I propose specialize the ParameterTree::Parser struct for char to read
> > a numerical value instead of a character.
> You might also want the character depending on your application.
> The main reason to forbid the implicit type determination is that
> similar unexpected behaviour might appear if the user defines the
> conversion for custom types.

If you want the character, you can always read a std::string, check that its
size is exactly 1, and take its first character.  If we keep it as it is and
you want the numerical value, you can convert the default value to an int and
convert the return value back to a char, checking that the return value
actually fits in a char.  Either way we make it a little harder for some
people, and I don't care much which way we choose.  I believe it is a little
more likely that people want the numerical value for a char (I've run
into the same problem before...)

As to custom conversions, do you mean IO-conversions (operator<<) or type
conversions here?  In both cases I don't see a problem that could be solved by
specifying the conversion type explicitly.

The problem here is really much more fundamental than just deciding on the
right way to parse a char; the interpretation of the parameter value string
really depends on more than just the type of the variable the result will be
assigned to.  There are two solutions to this: either we provide some
arbitrary interpretation based on what is more convenient most of the time,
and if that does not suit you, you have to extract the string value and do the
interpretation yourself.  Or we invent a full-blown and complex system of
parsers, making it possible to specify the desired interpretation based on the
type the application uses to store the value, some application-specific
interpretation (do you interpret vector<pair<int,int> > as a list of ranges
or as a list of mappings from one integer to another?), some site-specific
configuration-format style (does a vector<double> have to look like "0.5 4
7.3" or "0.5, 4, 7.3"?), and potentially many more things.  We are currently
much closer to the former solution, and I suggest keeping it that way.

Bye,
Jö.

-- 
<Ku]aku> seen _Armus_
-:- SignOff Ku]aku: #macht (changing servers)
<Volk> I don't know who _Armus_ is.
