| What's the app about | Why this app |
|---|---|
| Plotting functions and their intersection. | An interesting application of gradient descent. I'm a fan of plotting graphs (and visualizations in general). |
Let's say you are given the equations of curves and you need to plot the intersection of these curves. As an example, say you have 2 spheres (3D); how would you plot the intersection of the given spheres?
F_1(x) = ||x - a||^2 - r_1^2 and F_2(x) = ||x - b||^2 - r_2^2, where x, a & b are vectors of size 3 (a, b are the centres and r_1, r_2 the radii).
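As a concrete setup, the two sphere functions can be written in NumPy. The centres `a`, `b` and the radii below are made-up values, purely for illustration:

```python
import numpy as np

# Hypothetical centres and radii, chosen so the spheres intersect.
a = np.array([0.0, 0.0, 0.0])   # centre of sphere 1
b = np.array([1.0, 0.0, 0.0])   # centre of sphere 2
r1, r2 = 1.0, 1.0

def F1(x):
    # F_1(x) = ||x - a||^2 - r1^2; zero exactly when x lies on sphere 1
    return np.sum((x - a) ** 2) - r1 ** 2

def F2(x):
    # F_2(x) = ||x - b||^2 - r2^2; zero exactly when x lies on sphere 2
    return np.sum((x - b) ** 2) - r2 ** 2

# A point on the intersection circle of these two unit spheres:
p = np.array([0.5, np.sqrt(0.75), 0.0])
```

For this `p`, both `F1(p)` and `F2(p)` evaluate to zero, which is exactly the property we will exploit below.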
My first approach to this problem was finding the equation of the intersection of these 2 functions by equating them, i.e. F_1(x) = F_2(x), then simplifying the result and using that equation to plot the points. This approach is not feasible for 2 reasons:
- Equating the 2 functions won't necessarily give you the equation of the intersection. For instance, equating 2 equations of spheres gives you the plane containing their intersection, not the equation of the intersecting circle (if any).
- Even if you had such an equation, the question still remains: how do you plot points from a given equation?
If you observe, points that lie on the intersection of the curves should satisfy all the functions separately, i.e. F_i(x) = 0 for every function F_i.
So, another approach (highly ineffective) would be to generate random points and check whether they satisfy all the given equations. If a point does, it is a valid 'point'. Else, generate another random point and repeat until you have sufficient points. Downsides of this approach:
- The search space is too big, and it only grows with the dimension of the points.
- It is highly inefficient. It might take forever to stumble upon valid points.
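A sketch of this rejection-sampling idea, to make the inefficiency concrete. The function name, the sampling box, and the tolerance are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_sample(F_list, n_points, tol=1e-3, max_tries=100_000):
    """Draw random points; keep only those that (almost) satisfy every F_i."""
    points = []
    for _ in range(max_tries):
        x = rng.uniform(-2, 2, size=3)           # random candidate in a fixed box
        if all(abs(F(x)) < tol for F in F_list):  # valid only if *every* curve is satisfied
            points.append(x)
            if len(points) == n_points:
                break
    return points
```

For curves like the two spheres above, the set of valid points has zero volume, so with any reasonably small tolerance this loop will almost never accept a candidate.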
Can we modify the previous approach? Instead of discarding an invalid randomly generated point, can we update it iteratively so that it approaches a valid solution? If so, what would it mean to be a valid solution, and when should we stop updating the sample?
If the point lies on the intersection of the curves, it should satisfy, for all i, F_i(x) = 0.
We can define a function G as the summation of the given functions, i.e. G(x) = F_1(x) + F_2(x) + ... + F_n(x).
So, we would like to say that a point is valid when it satisfies G(x) = 0. Note that this only works if G(x) = 0 forces every F_i(x) to be zero; with a plain sum, positive and negative terms could cancel, which is one more reason to modify G shortly. Still, G(x) = 0 will be our criterion for checking whether a point is a valid solution.
However, we are not yet done. The range of G(x) can be anywhere from -∞ to +∞. This means the minimum value of G(x) is not necessarily 0. This is a problem because if we keep minimizing G(x) iteratively by updating x, the value of G(x) may cross 0 and approach a negative value (its minimum).
This could be solved if the minimum of G(x) were 0 itself. That way we could keep updating x until G(x) approaches the minimum (0 in this case). So, we need a slight modification of G(x) such that its minimum value is 0.
My first instinct was to define G(x) as the sum of the absolute values of F_i(x), i.e. G(x) = |F_1(x)| + |F_2(x)| + ... + |F_n(x)|.
The minimum value of this function is 0 and it satisfies all the conditions discussed above. However, if we are trying to use gradient descent, the modulus operation can be problematic because the function may no longer be smooth (|t| is not differentiable at t = 0).
So, what's an easy alternative to the modulus operator that also preserves smoothness? Use squares! G(x) = F_1(x)^2 + F_2(x)^2 + ... + F_n(x)^2.
This function can now be minimised to get the points of intersection of the curves.
- The function will be smooth and continuous, provided the F_i(x) are themselves smooth and continuous.
- The minimum value of G(x) is zero.
- The minimum value of G(x) is attained exactly at the intersection of all the F_i(x).
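Putting this together for the two-sphere example (the centres and radii are assumed values, as before):

```python
import numpy as np

a = np.array([0.0, 0.0, 0.0]); r1 = 1.0   # sphere 1: centre a, radius r1
b = np.array([1.0, 0.0, 0.0]); r2 = 1.0   # sphere 2: centre b, radius r2

def G(x):
    # Sum of squared residuals: zero iff x lies on *both* spheres,
    # strictly positive everywhere else.
    f1 = np.sum((x - a) ** 2) - r1 ** 2
    f2 = np.sum((x - b) ** 2) - r2 ** 2
    return f1 ** 2 + f2 ** 2
```

Squaring keeps each term non-negative, so no cancellation between the two residuals is possible and G(x) = 0 is attained only on the intersection circle.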
```
Generate a random point x
While G(x) > tolerance:
    x = x - lr * gradient(G(x))
Repeat for N points.
```

(In floating point, G(x) will never be exactly 0, so we stop once it falls below a small tolerance.)
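The loop above can be sketched in NumPy for the two-sphere example. The centres, radii, learning rate, and tolerance are illustrative choices, and the gradient follows from the chain rule:

```python
import numpy as np

a = np.array([0.0, 0.0, 0.0]); r1 = 1.0
b = np.array([1.0, 0.0, 0.0]); r2 = 1.0

def G(x):
    f1 = np.sum((x - a) ** 2) - r1 ** 2
    f2 = np.sum((x - b) ** 2) - r2 ** 2
    return f1 ** 2 + f2 ** 2

def grad_G(x):
    # Chain rule: d/dx [f^2] = 2 f * df/dx, with df1/dx = 2 (x - a), etc.
    f1 = np.sum((x - a) ** 2) - r1 ** 2
    f2 = np.sum((x - b) ** 2) - r2 ** 2
    return 4 * f1 * (x - a) + 4 * f2 * (x - b)

def find_point(lr=0.01, tol=1e-10, max_steps=50_000, seed=0):
    """One run of the loop: random start, descend until G(x) is (near) zero."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, size=3)       # random starting point
    for _ in range(max_steps):
        if G(x) < tol:
            break
        x = x - lr * grad_G(x)
    return x
```

Calling `find_point` with N different seeds yields N points scattered along the intersection circle of the two spheres.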
Assumptions:
- Curves do intersect somewhere.
- The individual curves are themselves differentiable.