I’m using the classic projectile-motion equations to try to calculate a trajectory for an object in my game. The formula I’m using for the launch speed is
launch_vel = sqrt(range * gravity / sin(2 * theta))
where theta is the angle of launch.
Now my gravity constant is 9 and the game runs at 30 fps. Am I right in thinking that dividing 9 by 30 gives the per-frame change, i.e. 9/30 = 0.3 is the amount to add to the y velocity each frame, for an overall acceleration of 9 units/sec²?
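To make that concrete, here’s a minimal sketch of the per-frame update I mean (Python, names are made up; it assumes velocities are stored in units/sec and positions advance by vel * dt each frame):

```python
GRAVITY = 9.0    # units / sec^2
FPS = 30
DT = 1.0 / FPS   # fixed timestep: one frame at 30 fps

def step_y(pos_y, vel_y):
    """Advance one frame: gravity changes velocity, velocity changes position."""
    vel_y -= GRAVITY * DT   # subtract 9/30 = 0.3 from vy each frame
    pos_y += vel_y * DT     # velocity is units/sec, so scale by the timestep
    return pos_y, vel_y
```

After 30 frames (one second) starting from rest, vel_y has changed by 9 units/sec, which is the acceleration I’d expect.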
I’m trying to plug gravity into this equation: I’m launching at 45 degrees with a range of 200, and I seem to be landing about 48.2 units short of the target (151.8 instead of 200).
I’m applying 9/30 to the y velocity of the object every frame, which should yield an overall acceleration of 9 units/sec² - does that sound right?
My launch speed ends up at 9.2 units/sec and the peak of the arc is around 38 units - I’m pretty sure this peak should be 50, since the peak height of a 45-degree launch is range * tan(theta) / 4 = 200/4.
I’m adding the calculated velocity to the velx and vely of the object at launch, taking the launch angle into account (using cos/sin), but it still falls short…
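For reference, here’s a self-contained sketch of the whole setup as described (Python, names are my own; it assumes the flat-ground range relation R = v² sin(2θ)/g, angles in radians, and a fixed 30 fps step):

```python
import math

GRAVITY = 9.0    # units / sec^2
FPS = 30
DT = 1.0 / FPS   # fixed timestep: one frame

def launch_velocity(target_range, theta):
    """Speed needed to cover target_range on flat ground: R = v^2 * sin(2*theta) / g."""
    return math.sqrt(target_range * GRAVITY / math.sin(2 * theta))

def simulate(target_range, theta):
    """Step the projectile one frame at a time until it returns to ground level.

    Returns (landing_x, peak_height).
    """
    speed = launch_velocity(target_range, theta)
    vel_x = speed * math.cos(theta)   # split launch speed with cos/sin
    vel_y = speed * math.sin(theta)
    x = y = peak = 0.0
    while True:
        vel_y -= GRAVITY * DT         # apply 9/30 to vy each frame
        x += vel_x * DT
        y += vel_y * DT
        peak = max(peak, y)
        if y <= 0.0:                  # back at (or below) launch height
            return x, peak
```

With theta = math.radians(45) and a range of 200 this gives a launch speed of sqrt(200 * 9) = sqrt(1800) ≈ 42.4 units/sec.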
any ideas?