Suppose I have a point cloud given in 4D space, which can be made arbitrarily dense. These points will lie on the surface of a polytope, which is an unknown object at this point. (Just to provide some unhelpful visualization, here is a rather arbitrary projection of the given point cloud, i.e. a plot of the first 3 columns of the 100000x4 array):
My goal is to obtain the vertices of this polytope. One of my numerous attempts to get them was computing the convex hull of the point cloud. The problem there was that the number of resulting vertices varied with the specified tolerance, and there was no a priori way of telling which tolerance to use. I also noted that the qhull algorithm is not a good fit here, as all given points actually lie on the hull.

In my former question, the answer I accepted suggested setting up a face loss function and minimizing it with gradient descent. This seems like a nice approach, but I wasn't able to implement it at the time. I assume this would be a linear programming question, which I unfortunately know nothing about. I scanned a few online resources and it seems that finding such extreme points is something that is regularly encountered in that setting (is that what convex optimization is about?), which is why I hope it makes sense to ask a follow-up question on SO.
I also hope it doesn't violate etiquette if I paste the suggested approach here:
Let x_i ∈ R^4, i = 1, ..., m, be the PCA-reduced data points.
Let F = (a, b) be a face, where a ∈ R^4 with a • a = 1 and b ∈ R.
We define the face loss function L as follows, where λ+, λ- > 0 are parameters you choose. λ+ should be a very small positive number. λ- should be a very large positive number.
L(F) = Σ_i (λ+ • max(0, a • x_i + b) - λ- • min(0, a • x_i + b))
We want to find minimal faces F with respect to the loss function L. The small set of all minimal faces will describe your polytope. You can solve for minimal faces by randomly initializing F and then performing gradient descent using the partial derivatives ∂L/∂a_j, j = 1, 2, 3, 4, and ∂L/∂b. At each step of gradient descent, constrain a • a to be 1 by normalizing.
∂L/∂a_j = Σ_i (λ+ • x_{i,j} • [a • x_i + b > 0] - λ- • x_{i,j} • [a • x_i + b < 0]) for j = 1, 2, 3, 4
∂L/∂b = Σ_i (λ+ • [a • x_i + b > 0] - λ- • [a • x_i + b < 0])
Note Iverson brackets: [P] = 1 if P is true and [P] = 0 if P is false.
Can someone provide a few pointers on how I should approach putting this into code? I suppose I could use Matlab's linprog for that? For instance, I am not quite certain what it means to randomly initialize a face, and whether a and b are the unknowns at this point.
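To make this more concrete, here is roughly how I picture the suggested procedure; I am not at all sure this is what was meant, and the parameter values are pure guesses:

```matlab
% X: m x 4 array of (PCA-reduced) data points, loaded beforehand
lambdaPlus  = 1e-3;                 % "very small" positive parameter (guess)
lambdaMinus = 1e3;                  % "very large" positive parameter (guess)
stepSize    = 1e-4;

a = randn(4, 1); a = a / norm(a);   % random unit normal, so that a.a = 1
b = randn;                          % random offset

for it = 1:10000
    s = X * a + b;                  % a.x_i + b for all points
    outside = s > 0;                % Iverson bracket [a.x_i + b > 0]
    inside  = s < 0;                % Iverson bracket [a.x_i + b < 0]
    gradA = lambdaPlus  * sum(X(outside, :), 1)' ...
          - lambdaMinus * sum(X(inside,  :), 1)';
    gradB = lambdaPlus  * nnz(outside) - lambdaMinus * nnz(inside);
    a = a - stepSize * gradA;
    b = b - stepSize * gradB;
    a = a / norm(a);                % constrain a.a = 1 by normalizing
end
```

Is this what "randomly initializing F" refers to, and would I simply restart this with different random a and b to collect the different faces?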
Also, here is a link to one of the datasets.
The following uses a RANSAC approach to fit 2D planes to an nD point cloud. Afterwards, the vertices can be found by intersecting those planes.
Algorithmic idea:
The general idea of RANSAC is really simple:

1. Pick a minimal random sample of the data and fit a model to it.
2. Count how many of the remaining points are consistent with that model (the inliers).
3. Keep the model if it has enough inliers, otherwise discard it and repeat.

In our example, we can use it in the following way:

1. Pick 3 random points of the cloud; they span a 2D plane in 4D space.
2. Compute the distance of all other points to that plane and count the points below some tolerance (the inliers of this candidate face).
3. If the plane has enough inliers, accept it as a face of the polytope, remove its inliers from the cloud, and repeat until (almost) no points are left.
Code needed to implement this:
We need a bit of code to achieve this, which is beyond the scope of this answer:

- `distancePointsAffineSpace`: computes the distances of points to an affine space
- `affineSpaceIntersection`: computes the intersection of two affine spaces
- `uniquetol`: unique values/rows within a tolerance (as of version R2015a, this functionality is built-in)

Implementation:
With this the RANSAC part is pretty simple:
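In pseudo-Matlab, a sketch of this could look as follows (the helper signature `distancePointsAffineSpace(base, dirs, points)` and the parameter values are assumptions here; the linked code below may differ):

```matlab
% cloud: m x 4 array of data points, loaded beforehand
points     = cloud;
maxDist    = 1e-3;     % inlier tolerance (assumed value)
minInliers = 1000;     % accept a plane only if it explains enough points

planes = {};           % each entry: {base point, 2 x 4 direction vectors}
while size(points, 1) > minInliers
    % 1) minimal sample: 3 points span a 2D affine plane in 4D
    idx  = randperm(size(points, 1), 3);
    base = points(idx(1), :);
    dirs = [points(idx(2), :) - base; points(idx(3), :) - base];

    % 2) inliers are all points close enough to the candidate plane
    dist    = distancePointsAffineSpace(base, dirs, points);   % assumed helper
    inliers = dist < maxDist;

    % 3) accept the plane and remove its inliers, otherwise try again
    if nnz(inliers) >= minInliers
        planes{end + 1} = {base, dirs};
        points(inliers, :) = [];
    end
end
```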
The rest of the code is rather long, so I will just write down the basic idea and link to the code.
Once we have found our model planes, we:

- intersect each pair of planes via `affineSpaceIntersection`; two 2D planes in 4D space generically meet in a single point, so this yields a set of vertex candidates,
- discard candidates that are not close to any point of the cloud (these come from intersections of non-adjacent faces), and
- merge the remaining near-duplicate candidates into the final vertex list via `uniquetol`.
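A rough sketch of these steps, again with an assumed signature for `affineSpaceIntersection` (see the linked code for the actual implementation):

```matlab
% collect vertex candidates from pairwise plane intersections
candidates = zeros(0, 4);
for i = 1:numel(planes)
    for j = i + 1:numel(planes)
        p = affineSpaceIntersection(planes{i}, planes{j});   % assumed helper
        if ~isempty(p)
            candidates(end + 1, :) = p(:)';
        end
    end
end

% keep only candidates that are supported by the point cloud
keep = false(size(candidates, 1), 1);
for k = 1:size(candidates, 1)
    d = sqrt(sum(bsxfun(@minus, cloud, candidates(k, :)).^2, 2));
    keep(k) = min(d) < maxDist;
end

% merge near-duplicate candidates into the final vertex list
vertices = uniquetol(candidates(keep, :), 1e-6, 'ByRows', true);
```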
You can download the entire code for this here.
Visualization:
It seems you are dealing with the boundary of a 3D Möbius strip made from three extruded triangles and three extruded quadrangles. Here is a 4D rotation of this strip, projected into 3D (which is then projected onto your 2D screen).