## Summary
A dynamic expression is a snippet of code that can change throughout runtime - compilation is not possible! DynamicExpressions.jl does the following:

1. Defines an enum over user-specified operators.
2. Using this enum, it defines a [very lightweight and type-stable data structure](https://symbolicml.org/DynamicExpressions.jl/dev/types/#DynamicExpressions.EquationModule.Node) for arbitrary expressions.
3. It then generates specialized [evaluation kernels](https://github.com/SymbolicML/DynamicExpressions.jl/blob/fe8e6dfa160d12485fb77c226d22776dd6ed697a/src/EvaluateEquation.jl#L29-L66) for the space of potential operators.
4. It also generates kernels for the [first-order derivatives](https://github.com/SymbolicML/DynamicExpressions.jl/blob/fe8e6dfa160d12485fb77c226d22776dd6ed697a/src/EvaluateEquationDerivative.jl#L139-L175), using [Zygote.jl](https://github.com/FluxML/Zygote.jl).
5. It can also operate on arbitrary other types (vectors, tensors, symbols, strings, etc.) - see last part below.

It also has import and export functionality with [SymbolicUtils.jl](https://github.com/JuliaSymbolics/SymbolicUtils.jl), so you can move your runtime expression into a CAS!
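Items 1-3 above can be pictured with a simplified sketch in plain Julia. Everything below (the `MyNode` type, `my_eval`, and the field layout) is an illustration invented for this README, not the package's actual internals:

```julia
# Simplified sketch of a degree-tagged expression node (illustration only;
# NOT the real DynamicExpressions.jl implementation).
mutable struct MyNode
    degree::Int                # 0 = leaf, 1 = unary op, 2 = binary op
    constant::Bool             # leaf: constant or feature?
    val::Float64               # constant value (if constant leaf)
    feature::Int               # feature index (if feature leaf)
    op::Int                    # integer index into the operator tuple
    l::Union{MyNode,Nothing}   # left child
    r::Union{MyNode,Nothing}   # right child
end

leaf_feature(f) = MyNode(0, false, 0.0, f, 0, nothing, nothing)
leaf_const(v)   = MyNode(0, true, v, 0, 0, nothing, nothing)

# Recursive evaluation over a batch X (features x samples):
function my_eval(n::MyNode, X, unaops, binops)
    if n.degree == 0
        return n.constant ? fill(n.val, size(X, 2)) : X[n.feature, :]
    elseif n.degree == 1
        return map(unaops[n.op], my_eval(n.l, X, unaops, binops))
    else
        return map(binops[n.op], my_eval(n.l, X, unaops, binops),
                   my_eval(n.r, X, unaops, binops))
    end
end

# x1 * cos(x2 - 3.2), encoded with unaops = (cos,) and binops = (+, -, *):
inner = MyNode(2, false, 0.0, 0, 2, leaf_feature(2), leaf_const(3.2))  # x2 - 3.2
tree  = MyNode(2, false, 0.0, 0, 3, leaf_feature(1),
               MyNode(1, false, 0.0, 0, 1, inner, nothing))            # x1 * cos(...)
```

Storing the operator as a compact integer index, rather than a function pointer, is what makes this style of tree type-stable and cheap to mutate.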
## Example
```julia
using DynamicExpressions

operators = OperatorEnum(; binary_operators=[+, -, *], unary_operators=[cos])

x1 = Node(; feature=1)
x2 = Node(; feature=2)

expression = x1 * cos(x2 - 3.2)

X = randn(Float64, 2, 100);
expression(X) # 100-element Vector{Float64}
```

(We can construct this expression with normal operators, since calling `OperatorEnum()` will `@eval` new functions on `Node` that use the specified enum.)

## Speed

First, what happens if we naively use Julia symbols to define and then evaluate this expression?

```julia
@btime eval(:(X[1, :] .* cos.(X[2, :] .- 3.2)))
# 117,000 ns
```

This is quite slow, meaning it will be hard to quickly search over the space of expressions. Let's see how DynamicExpressions.jl compares:

```julia
@btime expression(X)
# 693 ns
```

Much faster! And we didn't even need to compile it. (Internally, this is calling `eval_tree_array(expression, X, operators)` - where `operators` has been pre-defined when we called `OperatorEnum()`).

If we change `expression` dynamically with a random number generator, it will have the same performance:

```julia
@btime begin
    expression.op = rand(1:3)
    expression(X)
end
# 842 ns
```

Now, let's see the performance if we had hard-coded these expressions:
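A direct version can be written as follows; the helper name `f` is just for illustration, and timing it would use `@btime f(X)` from BenchmarkTools.jl exactly as above:

```julia
# Hard-coded version of x1 * cos(x2 - 3.2) over the batch (name `f` is illustrative):
X = randn(Float64, 2, 100)
f(X) = X[1, :] .* cos.(X[2, :] .- 3.2)
f(X)  # 100-element Vector{Float64}
```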
So, our dynamic expression evaluation is about the same (or even a bit faster) as evaluating a basic hard-coded expression! Let's see if we can optimize the speed of the hard-coded version:

```julia
f_optimized(X) = begin
    y = Vector{Float64}(undef, 100)
    @inbounds @simd for i=1:100
        y[i] = X[1, i] * cos(X[2, i] - 3.2)
    end
    y
end
@btime f_optimized(X)
# 526 ns
```

The `DynamicExpressions.jl` version is only 25% slower than one which has been optimized by hand into a single SIMD kernel! Not bad at all.

More importantly: we can change `expression` throughout runtime, and expect the same performance. This makes this data structure ideal for symbolic regression and other evaluation-based searches over expression trees.

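As a toy illustration of such an evaluation-based search, written in plain base Julia rather than the package API: scan an enum-like tuple of operators and keep whichever best fits a target:

```julia
# Toy evaluation-based search: try each binary operator (indexed by an
# integer, enum-style) and keep the one with the lowest squared error.
function best_operator(ops, X, y)
    best_op, best_loss = 0, Inf
    for i in eachindex(ops)
        pred = map(ops[i], X[1, :], cos.(X[2, :] .- 3.2))
        loss = sum(abs2, pred .- y)
        if loss < best_loss
            best_op, best_loss = i, loss
        end
    end
    return best_op
end

X = randn(2, 100)
y = X[1, :] .* cos.(X[2, :] .- 3.2)   # target generated with `*`
best_operator((+, -, *), X, y)        # 3: the `*` operator fits exactly
```

A real symbolic regression loop mutates the tree (operators, features, constants) in exactly this spirit, re-evaluating after each change.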
## Derivatives

```julia
x1 = Node(; feature=1)
x2 = Node(; feature=2)
expression = x1 * cos(x2 - 3.2)
```

We can take the gradient with respect to inputs with simply the `'` character:

```julia
grad = expression'(X)
```

This is quite fast:

```julia
@btime expression'(X)
# 2894 ns
```

and again, we can change this expression at runtime, without loss in performance!

```julia
@btime begin
    expression.op = rand(1:3)
    expression'(X)
end
# 3198 ns
```

Internally, this is calling the `eval_grad_tree_array` function, which performs forward-mode automatic differentiation on the expression tree with Zygote-compiled kernels. We can also compute the derivative with respect to constants:

```julia
result, grad, did_finish = eval_grad_tree_array(expression, X, operators; variable=false)
```

or with respect to variables, and only in a single direction:

```julia
feature = 2
result, grad, did_finish = eval_diff_tree_array(expression, X, operators, feature)
```
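For intuition about what single-direction forward-mode differentiation does, here is a toy dual-number sketch (an illustration only - the package uses Zygote-compiled kernels, not this code):

```julia
# Toy forward-mode AD: carry (value, derivative) pairs through each operator.
struct Dual
    v::Float64   # value
    d::Float64   # derivative along the seeded direction
end
Base.:*(a::Dual, b::Dual) = Dual(a.v * b.v, a.d * b.v + a.v * b.d)
Base.:-(a::Dual, b::Dual) = Dual(a.v - b.v, a.d - b.d)
Base.cos(a::Dual) = Dual(cos(a.v), -sin(a.v) * a.d)

# d/dx2 of x1 * cos(x2 - 3.2) at (x1, x2) = (1.5, 0.7):
x1 = Dual(1.5, 0.0)   # derivative seed 0: not the direction of interest
x2 = Dual(0.7, 1.0)   # derivative seed 1: differentiate w.r.t. feature 2
y  = x1 * cos(x2 - Dual(3.2, 0.0))
y.d  # equals -1.5 * sin(0.7 - 3.2)
```

Seeding one input with derivative 1 and the rest with 0 is exactly the "single direction" that `eval_diff_tree_array` takes via its `feature` argument.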

## Generic types

> Does this work for only scalar operators on real numbers, or will it work for `MyCrazyType`?

I'm so glad you asked. `DynamicExpressions.jl` actually will work for **arbitrary types**! However, to work on operators other than real scalars, you need to use the `GenericOperatorEnum` instead of the normal `OperatorEnum`. Let's try it with strings!

```julia
x1 = Node(String; feature=1)
```

This node will be used to index input data (whatever it may be) with `selectdim(data, 1, feature)`. Let's now define some operators to use:

```julia
my_string_func(x::String) = "Hello $x"

operators = GenericOperatorEnum(;
    binary_operators=[*],
    unary_operators=[my_string_func],
    extend_user_operators=true)
```

Now, let's create an expression:

```julia
tree = x1 * " World!"
tree(["Hello", "Me?"])
# Hello World!
```

So indeed it works for arbitrary types. It is a bit slower due to the potential for type instability, but it's not too bad: