
Re: [SG19] Jan 13 SG19 call

From: David Lindelof <lindelof_at_[hidden]>
Date: Fri, 15 Jan 2021 08:26:02 -0800
Hey guys,

It had been a while since I attended one of these calls and I’m amazed at
the incredible work that’s been done. Quick question: what’s the preferred
means of discussion/feedback? We used to have a slack channel but it seems
abandoned now. I was curious to know more about the function
differentiation mechanisms in Python among other things.


David Lindelöf, Ph.D.
+41 (0)79 415 66 41
http://davidlindelof.com
Follow me on Twitter:
http://twitter.com/dlindelof


On 14 January 2021 at 22:33:45, Michael Wong via SG19 (sg19_at_[hidden])
wrote:



On Wed, Jan 13, 2021 at 3:21 PM Michael Wong <fraggamuffin_at_[hidden]> wrote:

> SG19 Machine Learning 2 hours. This session will focus on Differential
> Calculus and reinforcement learning but with updates from all the others
> optionally.
>
> Link to Automatic differentiation proposal:
>
>
> https://docs.google.com/document/d/175wIm8o4BNGti0WLq8U6uZORegKVjmnpfc-_E8PoGS0/edit?ts=5fff27cd#heading=h.9ogkehmdmtel
>
> Hi,
>
> Michael Wong is inviting you to a scheduled Zoom meeting.
>
> Topic: SG19 monthly Dec 2020-Feb 2021
> Time: Jan 14, 2021 02:00 PM Eastern Time (US and Canada)
> Every month on the Second Thu, until Feb 11, 2021, 3 occurrence(s)
> Dec 10, 2020 02:00 PM ET 1900 UTC
> Jan 14, 2021 02:00 PM ET 1900 UTC
> Feb 11, 2021 02:00 PM ET 1900 UTC
> Please download and import the following iCalendar (.ics) files to your
> calendar system.
> Monthly:
>
> https://iso.zoom.us/meeting/tJctf-2tpzotGNHL5pZqwtjELee0mcG2zzCi/ics?icsToken=98tyKuCrrjMuH92UtxuCRowqAoqgLO_xmH5ajY11sEr1OTFEdgnTGudHYr98N4rK
>
> Join from PC, Mac, Linux, iOS or Android:
> https://iso.zoom.us/j/93084591725?pwd=K3QxZjJlcnljaE13ZWU5cTlLNkx0Zz09
> Password: 035530
>
> Or iPhone one-tap :
> US: +13017158592,,93084591725# or +13126266799,,93084591725#
> Or Telephone:
> Dial(for higher quality, dial a number based on your current location):
> US: +1 301 715 8592 or +1 312 626 6799 or +1 346 248 7799 or +1
> 408 638 0968 or +1 646 876 9923 or +1 669 900 6833 or +1 253 215 8782
> or 877 853 5247 (Toll Free)
> Meeting ID: 930 8459 1725
> Password: 035530
> International numbers available: https://iso.zoom.us/u/agewu4X97
>
> Or Skype for Business (Lync):
> https://iso.zoom.us/skype/93084591725
>
> Agenda:
>
> 1. Opening and introductions
>
> The ISO Code of conduct:
> https://www.iso.org/files/live/sites/isoorg/files/store/en/PUB100397.pdf
> The IEC Code of Conduct:
>
> https://basecamp.iec.ch/download/iec-code-of-conduct-for-delegates-and-experts/
>
> ISO patent policy.
>
> https://isotc.iso.org/livelink/livelink/fetch/2000/2122/3770791/Common_Policy.htm?nodeid=6344764&vernum=-2
>
> The WG21 Practices and Procedures and Code of Conduct:
>
> https://isocpp.org/std/standing-documents/sd-4-wg21-practices-and-procedures
>
> 1.1 Roll call of participants
>
 Richard Dosselman, Phil Ratzloff, Andrew Lumsdaine, Ayenem, David
Lindelof, Cyril Khazan, Eugenio Bargiacchi, Jens Maurer, Joe Sachs, Kevin
Deweese, Larry Lewis, Marco Foco, Ozan Irsoy, Scott McMillan, Vassil
Vassilev, Will Wray, Michael Wong, William Moses

> 1.2 Adopt agenda
>
LA

1.3 Approve minutes from previous meeting, and approve publishing
> previously approved minutes to ISOCPP.org
>
> 1.4 Action items from previous meetings
>
> 2. Main issues (125 min)
>
> 2.1 General logistics
>
> Meeting plan, focus on one paper per meeting but does not preclude other
> paper updates:
>
> Dec 10, 2020 02:00 PM ET 1900 UTC Stats DONE
> Jan 14, 2021 02:00 PM ET 1900 UTC Reinforcement Learning and Diff
> Calculus
> Feb 11, 2021 02:00 PM ET 1900 UTC Graph
>
> ISO meeting status
>
> future C++ Std meetings
>
> 2.2 Paper reviews
>
> 2.2.1: ML topics
>
> 2.2.1.1 Differential Calculus:
>
>
> https://docs.google.com/document/d/175wIm8o4BNGti0WLq8U6uZORegKVjmnpfc-_E8PoGS0/edit?ts=5fff27cd#heading=h.9ogkehmdmtel
>
>
> WM:
 Enzyme works on LLVM IR, not the AST
differentiation is first class in other languages
can't train the network if one part of the loss function isn't AD compatible
cannot do BP (backpropagation)
there is a vast set of C++ codebases in ML

numerical
symbolic
AD, or algorithmic differentiation, using the chain rule
will generate the derivative function
can handle non-closed-form expressions
implementations exist:
1. operator overloading: create a new differentiable double type; this is AOT
compilation and will not work if pow is not already there
2. source transformation, like Tapenade
two styles:
1. forward mode (one input and multiple outputs)
2. reverse mode (multiple inputs and one output; gradients)

Dual numbers
a + εb is a dual number (ε² = 0)
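
To make the dual-number / operator-overloading idea concrete, here is a
minimal forward-mode sketch; the Dual type and the function f are made up
for illustration, not taken from the proposal:

    #include <cmath>
    #include <iostream>

    // Forward-mode AD via dual numbers: carry the value a and the
    // derivative part b of a + eps*b (eps^2 = 0) through every operation.
    struct Dual {
        double val;  // a
        double der;  // b, the derivative with respect to the seeded input
    };

    Dual operator+(Dual x, Dual y) { return {x.val + y.val, x.der + y.der}; }
    Dual operator*(Dual x, Dual y) {
        return {x.val * y.val, x.der * y.val + x.val * y.der};  // product rule
    }
    Dual sin(Dual x) { return {std::sin(x.val), std::cos(x.val) * x.der}; }

    // User code written against the overloaded type.
    Dual f(Dual x) { return x * x + sin(x); }

    int main() {
        Dual x{2.0, 1.0};  // seed dx/dx = 1
        Dual y = f(x);
        std::cout << y.val << " " << y.der << "\n";  // f(2) and f'(2)
    }

This also shows the limitation noted above: any function not written (or
overloaded) for the Dual type, e.g. an opaque pow from a prebuilt library,
is invisible to the operator-overloading approach.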

needs to be native
operator-overloading approaches (Adept) and source rewriters (OpenAD) rewrite
the original program to use the specific subset that is differentiable; JAX
does this

AD integrated with the language/compiler is efficient
several proofs of concept: Enzyme and Clad
Enzyme can work with different front ends
for all functions, it will build the backward pass
now we can optimize this so it looks like hand-compiled derivatives
combining AD with optimization

running AD after optimization
loop-invariant code motion, assuming no aliasing between out and in
using restrict

if you do AD then code motion, it is O(N²)
if you do code motion then AD, it is O(N)
ADBench from MS tested Enzyme
and shows this case
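
A hedged sketch of how loop-invariant code motion and restrict interact with
the order of AD; the normalize/mag kernel below is illustrative, not
necessarily the benchmark used in the talk, and __restrict is a common
compiler extension (spelled restrict in C):

    #include <cmath>
    #include <cstddef>

    // O(n) reduction that is loop-invariant in the loop below.
    double mag(const double* __restrict x, std::size_t n) {
        double sum = 0.0;
        for (std::size_t i = 0; i < n; ++i) sum += x[i] * x[i];
        return std::sqrt(sum);
    }

    // With restrict (no aliasing between out and in), the mag(in, n) call
    // can be hoisted out of the loop (LICM), making the kernel O(n).
    // As the notes above say: run AD first and the generated code keeps a
    // per-iteration mag, ending up O(N²); optimize first and then
    // differentiate the hoisted form, and it stays O(N).
    void normalize(double* __restrict out, const double* __restrict in,
                   std::size_t n) {
        for (std::size_t i = 0; i < n; ++i) {
            out[i] = in[i] / mag(in, n);
        }
    }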

C++ proposal
supporting AD at a low level enables AD at a high level
so it still works if you import, e.g., the Eigen library

AL: composing different libraries, compiler based
will add a burden on compiler implementers
start with a minimal set that the compiler can differentiate
then custom derivatives and generic functions

implementation complexity: reverse mode in Clad is about 2000 lines


performance numbers with Clad? coming

less intrusive approaches, like libraries, have been evaluated and have lower
performance

could be a niche feature, which makes it a tough sell
use reflection to inspect the ADT?
reflection does not work on statements and expressions
P2040 (reflexpr) touched on expressions, but SG7 seems to be opposed to
exposing the AST
P2237

a library solution would serve no purpose to the community on existing code;
people moving from TF to


people moving from C++ to TF, from TF to PT

what about the portable format for modules by Gaby Dos Reis? some libraries
overload the explicit conversion operator

have we exhausted the library approach, as existing libraries are old?
are there modern expression template techniques?
yes, we explored libraries, even the latest ones, like AOki, which use a
shadow language

the eve library is a higher bar and favors small libraries

if we don't standardize it, someone will put it in LLVM

do not see modules supporting this; expression templates need ADL and
leave out a whole set of already-written C++ code
also an issue with scalability, like a chain of differentiation, and the space
is large
and will miss optimization opportunities
might be mitigated by CSE

is there a small set of features that could be provided by the compiler that
would make an efficient library implementation possible? I think none; no
library could come close to a compiler implementation
maybe CSE of templates
yes, I spent time trying to understand doing it as a library with a small set
of features from the compiler, but could not find any that work

continue in this direction, publish the paper



2.2.1.2 Reinforcement Learning Larry Lewis Jorge Silva
>
> Reinforcement Learning proposal:
>
LL:
RL depends on LA, ML, NN -> optimizer, AD, data loaders, LA, tensors
may be better to focus on tensors
mdspan and the owning mdarray, but they are not enough for a good tensor
library: no common tensor operations, just indexing and strides
NN needs matrix multiplication
NN needs AD to avoid hand-coding BP, especially for CN when doing RGD
Theano has AD
LA syntax and LA BLAS
LA syntax adds definitions of arithmetic operators to create matrix and
vector types; currently in LEWG, rebased to C++20 using requires clauses
great for mathematics, which reduces the interface
TensorFlow/Theano tensors look more like Python
may have to work with US NL physics tensors
heterogeneous TensorFlow is currently built on top of SYCL, CUDA
review xtensor, PyTorch, and TF tensor ops
optimizers, weights, by Andrew Lumsdaine
P1416 by the xtensor people
P1415
wg21.link
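
On the mdspan/mdarray point above, a minimal sketch of what "just indexing
and strides" means in practice: even a matrix multiply has to be hand-written.
std::mdspan here is the C++23 shape of the P0009 proposal; the names are
illustrative only:

    #include <cstddef>
    #include <mdspan>   // C++23

    using In  = std::mdspan<const double, std::dextents<std::size_t, 2>>;
    using Out = std::mdspan<double, std::dextents<std::size_t, 2>>;

    // mdspan gives multidimensional indexing over existing storage, but a
    // common tensor operation such as matrix multiplication is not provided
    // and must still be written (or taken from a BLAS-style library).
    void matmul(In a, In b, Out c) {
        for (std::size_t i = 0; i < a.extent(0); ++i)
            for (std::size_t j = 0; j < b.extent(1); ++j) {
                double sum = 0.0;
                for (std::size_t k = 0; k < a.extent(1); ++k)
                    sum += a[i, k] * b[k, j];
                c[i, j] = sum;
            }
    }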

Data tables: Nvidia is building data frame capability; xdataframe?



2.2.1.3 Graph Proposal, Phil Ratzloff et al
>
> P1709R1: Graph Proposal for Machine Learning
>
> P1709R3:
>
> https://docs.google.com/document/d/1kLHhbSTX7j0tPeTYECQFSNx3R35Mu3xO5_dyYdRy4dM/edit?usp=sharing
>
>
> https://docs.google.com/document/d/1QkfDzGyfNQKs86y053M0YHOLP6frzhTJqzg1Ug_vkkE/edit?usp=sharing
>
> <http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p2119r0.html>
>

split and reduce for C++26
common API?
concept of an algorithm
reducing the number of types and the number of functions by leveraging what is
in ranges
looking at BGL, the concepts are just a handful
the STL is just concepts and algorithms
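
A hedged sketch of what "a handful of concepts leveraging ranges" could look
like; the identifiers below (adjacency_graph, edges) are hypothetical, not the
P1709 wording:

    #include <concepts>
    #include <ranges>

    // Illustrative only: a graph is modeled as a range of vertices, and a
    // customization point edges(g, u) yields the outgoing edges of a vertex
    // as another range, so range algorithms compose with graph algorithms.
    template <class G>
    concept adjacency_graph =
        std::ranges::forward_range<G> &&
        requires(G& g, std::ranges::range_value_t<G> u) {
            { edges(g, u) } -> std::ranges::forward_range;
        };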


2.2.1.4: Stats paper
>
> Stats review Richard Dosselman et al
>
> P1708R3: Math proposal for Machine Learning: 3rd review
>
> PXXXX: combinatorics: 1st Review
>
> > std.org/jtc1/sc22/wg21/docs/papers/2020/p1708r2
> > above is the stats paper that was reviewed in Prague
> > http://wiki.edg.com/bin/view/Wg21prague/P1708R2SG19
> >
> > Review Jolanta Polish feedback.
> > http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p2119r0.html
>


Combinatorics
Python has factorials; perm/comb is missing from our math library
had identified them in the P1415 overview document
applications beyond SG19
need a large numerical type for factorials
Python has a built-in wide integer type

unbounded template T for a numeric type; we have failed to specify this for
the complex<T> type for float and long double
what operations are declared on that type? can't just say plug in any T

these could also be in C without the templates; may also consider how to do
it in C
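
A minimal sketch of the perm/comb gap and of the "what operations must T
support" question; the name binomial, the default result type, and the
overflow policy are assumptions for illustration, not the paper's interface:

    #include <cstdint>

    // n choose k; T must at least support construction from unsigned,
    // *, /, and comparison, and be wide enough for the result -- which is
    // exactly the "can't just say plug in any T" concern. Each intermediate
    // result * (n-k+i) is divisible by i, so integer division stays exact.
    template <class T = std::uint64_t>
    constexpr T binomial(unsigned n, unsigned k) {
        if (k > n) return T{0};
        if (k > n - k) k = n - k;   // symmetry keeps intermediates small
        T result{1};
        for (unsigned i = 1; i <= k; ++i) {
            result = result * T(n - k + i) / T(i);
        }
        return result;
    }

    static_assert(binomial(5, 2) == 10);
    static_assert(binomial(52, 5) == 2598960);

Swapping a wide-integer type in for T shows the open question: nothing in the
signature says which operations that T must provide.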




2.2.3 any other proposal for reviews?
>
> 2.3 Other Papers and proposals
>
> P1416R1: SG19 - Linear Algebra for Data Science and Machine Learning
>
> https://docs.google.com/document/d/1IKUNiUhBgRURW-UkspK7fAAyIhfXuMxjk7xKikK4Yp8/edit#heading=h.tj9hitg7dbtr
>
> P1415: Machine Learning Layered list
>
> https://docs.google.com/document/d/1elNFdIXWoetbxjO1OKol_Wj8fyi4Z4hogfj5tLVSj64/edit#heading=h.tj9hitg7dbtr
>
> 2.2.2 SG14 Linear Algebra progress:
> Different layers of proposal
>
> https://docs.google.com/document/d/1poXfr7mUPovJC9ZQ5SDVM_1Nb6oYAXlK_d0ljdUAtSQ/edit
>
> 2.5 Future F2F meetings:
>
> 2.6 future C++ Standard meetings:
> https://isocpp.org/std/meetings-and-participation/upcoming-meetings
>
> None
>
> 3. Any other business
>
> New reflector
>
> http://lists.isocpp.org/mailman/listinfo.cgi/sg19
>
> Old Reflector
> https://groups.google.com/a/isocpp.org/forum/#!newtopic/sg19
> <https://groups.google.com/a/isocpp.org/forum/?fromgroups=#!forum/sg14>
>
> Code and proposal Staging area
>
> 4. Review
>
> 4.1 Review and approve resolutions and issues [e.g., changes to SG's
> working draft]
>
> 4.2 Review action items (5 min)
>
> 5. Closing process
>
> 5.1 Establish next agenda
>
> TBD
>
> 5.2 Future meeting
>
>
> Feb 11, 2021 02:00 PM ET 1900 UTC Graph paper
>
-- 
SG19 mailing list
SG19_at_[hidden]
https://lists.isocpp.org/mailman/listinfo.cgi/sg19

Received on 2021-01-15 10:26:06