Hey all,

I'm trying to develop a vector-based collision detection system and was wondering if anyone has done this before?

Essentially, every 2D point in the world has an (x,y) coordinate, so with 2 points you have a vector. Say we draw a line between these 2 points; then whether any object collides with this line can be calculated using vectors and the cross product. (Or trig.)
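The cross-product version is the cheaper of the two, since it needs no acos calls. A rough sketch of what I mean, in Python just for brevity (the math ports straight to ActionScript; the point names p1, p2, q are mine):

```python
def cross(o, a, b):
    """2D cross product of the vectors o->a and o->b.
    It's zero exactly when the three points are collinear."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

p1, p2 = (0.0, 0.0), (10.0, 0.0)   # the two world points defining the line
q = (5.0, 0.0)                     # a point on some object

# near-zero cross product => q lies on the line through p1 and p2
print(abs(cross(p1, p2, q)) < 1e-5)  # True
```

Note this only tests the infinite line; containment between the two endpoints still has to be checked separately, as discussed below.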

Thus, each object needs to be a class that defines its edges (e.g. a square with corners 0,0; 20,0; 20,20; 0,20). In the root world, the edges are added onto the _parent movieclip's _x/_y, giving you an x/y that is on the same plane as the line you wish to detect against.
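For the class itself I'm picturing something like this (again Python for brevity; the class and method names are just placeholders, and the offset stands in for adding the movieclip's _x/_y):

```python
class Obstacle:
    """Stores corner points in the object's local coordinate space;
    world_corners() offsets them by the object's world position,
    like adding the _parent movieclip's _x/_y in Flash."""

    def __init__(self, x, y, corners):
        self.x, self.y = x, y
        self.corners = corners  # list of (x, y) pairs in local space

    def world_corners(self):
        return [(cx + self.x, cy + self.y) for cx, cy in self.corners]

# a 20x20 square whose movieclip sits at world position (100, 50)
square = Obstacle(100, 50, [(0, 0), (20, 0), (20, 20), (0, 20)])
print(square.world_corners())  # [(100, 50), (120, 50), (120, 70), (100, 70)]
```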

If we take the two points in the world and one point on the object, we can determine if there is a collision by finding the angle at one of the world points. Using trig and the three distances between the points (a, b, c), the law of cosines gives a^2 = b^2 + c^2 - 2bc(cosA), where A is the angle between the sides b and c (i.e. between the vectors from that world point to the other two points). Thus A = cos^-1 ((b^2 + c^2 - a^2) / 2bc). If this angle approaches 0 (ie. below 1x10^-5 rad), then we can assume that the object point is touching the line (collision).
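In code, that angle calculation would look roughly like this (Python again; the clamp is my addition, since floating-point error can push the acos argument a hair outside [-1, 1] when the angle is near 0 or pi):

```python
import math

def angle_at(p, q, r):
    """Angle at vertex p in triangle pqr, via the law of cosines:
    a^2 = b^2 + c^2 - 2bc*cos(A)  =>  A = acos((b^2 + c^2 - a^2) / (2bc)),
    where a is the side opposite p."""
    a = math.dist(q, r)
    b = math.dist(p, r)
    c = math.dist(p, q)
    cos_a = (b * b + c * c - a * a) / (2.0 * b * c)
    return math.acos(max(-1.0, min(1.0, cos_a)))  # clamp for float safety

p1, p2 = (0.0, 0.0), (10.0, 0.0)
print(angle_at(p1, p2, (5.0, 0.0)) < 1e-5)  # True: point is on the line
```

One caveat I noticed while writing it: if the object point coincides with a world point, b or c is zero and the division blows up, so that case needs a guard.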

However, we run into a problem when the object point being measured is beyond the end of the line: the angle at one world point will register 0 rads while the angle at the other will register pi (~3.14) rads. Thus the angles at both world points must be known. If both angles are ~0 rads, the object hits; otherwise it does not.
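So the full segment test ends up combining the two angle checks, something like (function names are mine, angle_at is the law-of-cosines calculation from above):

```python
import math

def angle_at(p, q, r):
    # law of cosines, clamped for float safety: a is the side opposite p
    a, b, c = math.dist(q, r), math.dist(p, r), math.dist(p, q)
    return math.acos(max(-1.0, min(1.0, (b * b + c * c - a * a) / (2.0 * b * c))))

def on_segment(p1, p2, q, eps=1e-5):
    """q hits the segment p1-p2 only if the angle is ~0 at BOTH
    endpoints; if q is past an endpoint, one angle comes out ~pi."""
    return angle_at(p1, p2, q) < eps and angle_at(p2, p1, q) < eps

print(on_segment((0.0, 0.0), (10.0, 0.0), (5.0, 0.0)))   # True: between the points
print(on_segment((0.0, 0.0), (10.0, 0.0), (15.0, 0.0)))  # False: beyond, one angle ~ pi
```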

Anyone care to comment? Do you think this will work? See any flaws?