Date: Sun, 22 Oct 2017 18:25:08 +0800
On 10/22/2017 1:52 PM, Nicolai Josuttis wrote:
> Oops,
> sorry, but I STRONGLY disagree here.
>
> Please correct me if I am wrong.
> Especially if you know better HOW to solve the problem
> feature macros solve, instead.
Keep your shirt on. I'm not an inexperienced little middle manager who
can be intimidated by your superior reputation or by liberal use of
CAPITAL LETTERS.
We need a serious discussion.
>
> Am 22.10.2017 um 04:28 schrieb Bjarne Stroustrup:
>> I see the utility of feature macros for library implementers, especially
>> for supporting old compilers. I hear the argument that they allow faster
>> use of new and experimental features, but I don't think that argument
>> has been sufficiently well articulated and weighed against the damage
>> done to the majority uses of the language and the evolution of the
>> language and its tools infrastructure.
>>
> We have a real problem here.
> We have a problem that ONLY can be solved by preprocessor abilities.
> The whole PURPOSE of the preprocessor including macros is to deal with
> implementation details.
>
> It is a fact that implementations only partially support things
> and don't give any portable hint about what they support.
> (Microsoft doesn't even handle __cplusplus in a useful fashion.)
> This will not change.
I said that I understood the advantage to library writers, but I also
asked for a better articulation of the problem. I obviously cannot offer
an improved solution to a weakly articulated problem. I cannot even
decide whether the problem is among those that it is important to solve
in the standard given our limited time and resources. Furthermore, for
every new feature we should consider not just the problems it
solves, but also what problems it might introduce and what existing
problems it might make worse. I feel this is lacking in this case.
Also, "The whole PURPOSE of the preprocessor including macros is to deal
with implementation details" is not historically accurate, and even if
it were, that would not imply that it is a good solution to the
implementation problems it is used to mitigate.
>
>> I fear the utility of feature macros to define dialects: "if (feature I
>> don't like) static_assert(false, "don't use this; it's bad for you")."
>> Fortunately this is (for me) mainly hypothetical, but over the years I
>> have had *many* hopeful questions along the lines of "is there a
>> standard way of disabling X?". I strongly prefer to leave such potential
>> major damage for local gain to non-standard hacks.
>>
> How can a feature macro be used to disable a feature???
Simple. As I said above:
static_assert(false, "don't use this; it's bad for you");
Place this in a header and require it for corporate use and you are done.
I hope I'm not giving you new bad ideas. As I said, the equivalent has been frequently suggested to me over the years.
>
>
>> Worse: the more macros, the harder it is to develop new tools for C++.
>> Macros are a major reason C++ is tool-poor compared to more modern
>> alternatives.
>>
> So, the only other option we have is to use platform-specific "feature
> macros". That is a lot worse because the resulting code is hard to write
> and hard to maintain.
> And the tool problem, you have in mind, Bjarne, will be a lot worse with
> that.
> So NOT having feature macros does not solve any problem you describe,
> but makes things significantly worse.
So you say. My guess is that you are flat wrong. Use of macros is at the
root of most of our tool problems. I think the best you could argue is
that adding a few dozen more standard macros won't make it worse (even
though that set is open-ended). I won't say that the burden of proof is
on proposers of more macros, but I will say that we need to consider
this problem before adding many more macros to the standard. The work to
start that discussion fairly falls on the proposers.
Theoretically, localizing the use of macros could alleviate the problem,
but I have not seen a proposal for that (unless you count my old
macroscope proposal).
>
>> Worse still: Over the last few weeks, I have been traveling, talking
>> with *many* senior developers from a variety of major C++ users. They
>> are excited about the promise of modules. In particular, many see it as
>> an opportunity to finally clean up their code, making it more hygienic
>> by getting rid of most macros, undisciplined aliases, and forward
>> declarations. Adding macros to the standard could reduce the internal
>> representation(s) of modules to token soup, and give us "modules" that
>> do not offer real modularity.
>>
> Can you give a concrete example, please?
Consider any internal representation of a module; the IPR
(http://www.stroustrup.com/gdr-bs-macis09.pdf) is an example. Each
time you have a macro, you need to represent the alternatives. These
alternatives need not respect scope or grammar rules, so the
representation cannot be a proper AST; it reduces to "token soup". A
lot of such token soup defeats the aim of modules, i.e. modularity. For
example:
int f(int
#ifdef FOO
      , int
#else
      , double
#endif
      );

#ifdef BAR
void do_something() {
#endif
    // ...
#ifdef BAR
}
#endif
I have seen these and far worse in production code, and I assume that so
have you. I'm sure that we can get many more examples if we start looking.
>
>> In haste: I am still traveling. I recommend treading very carefully
>> here. I am not a fan of macros, not even feature macros.
>>
> Obviously ;-)
> But you don't want to deprecate #define, do you? ;-)
> (To be honest, there is a good reason we have it, for cases
> that can't be solved by core language features.)
I don't think there are good reasons except compatibility and the
management of language evolution. In principle, we can do without
#define. My long-term aim is to eliminate #define. By long-term, I do
not mean next year or C++20. To be responsible stewards of C++'s
evolution, we must think in decades, and try not to make problems worse
by short-term fixes.
>
> IMO, we have a useful feature that solves a significant problem.
Which problem exactly? (You repeat yourself, so I do too.)
> It does NOT introduce the problem Bjarne fears (because it can't).
I disagree (see above).
> Not having it is a lot worse.
You repeat yourself again, presupposing your conclusion.
> So what's the problem?
That the problem to be solved has not (to my knowledge) been carefully
articulated and its side effects (incl. long-term effects) have not been
carefully explored.
Note that I do not assume that a principled solution to your problem is
impossible. However, a good solution may not be more macro hacking. For
example, note that I strongly opposed the original compile-time if
because it had the problems with tooling and ASTs that I mentioned
above, but I accepted the current compile-time if because it does not
have those problems.
>
>
>> On 10/21/2017 9:03 AM, Herb Sutter wrote:
>>>>> Furthermore, since we usually are in the business of standardizing
>>>>> things that users otherwise have to write many times themselves: Has
>>>>> SG10 considered actively defining a <std-forward-compat> header
>>>>> library that does the above for all the things it can, the idea being
>>>>> that users who have to target multiple implementations at various
>>>>> stages of conformance can include <std-forward-compat> after all their
>>>>> standard library's own headers and write their code more closely
>>>>> against the actual latest IS's std:: library,
>>>>> without having to reinvent the above by hand (incompatibly on
>>>>> different systems), as a transition tool to help encourage people to
>>>>> adopt the latest standard?
>>>> That's a very interesting idea, but might it make more sense for it
>>>> to be
>>> done
>>>> by LWG, as opposed to SG10?
>>> Whoever owns the feature tests should own how to use and adopt them so I
>>> would think the <std-forward-compat> header would fall under that. After
>>> all, it should be kept in sync with the feature tests.
>>>
>>> So initially to get the ball rolling, since we have a set of tests
>>> already
>>> that are maintained by SG10, wouldn't it be expected that the initial
>>> header
>>> to be created by SG10 too -- again, just to get the ball rolling?
>>>
>>> Going forward, if we standardize feature tests, then we will presumably
>>> expect each proposal author to suggest a feature test (where appropriate)
>>> and a <std-forward-compat> mechanism (if appropriate).
>>>
>>> Is that reasonable?
>>>
>>>
>>> _______________________________________________
>>> Features mailing list
>>> Features_at_[hidden]
>>> http://www.open-std.org/mailman/listinfo/features
>>
Received on 2017-10-22 12:25:17