The position of a point in a plane is usually described using Cartesian coordinates: a point $P$ is described by its displacement from the origin in the direction of the $x$-axis and its displacement from the origin in the direction of the $y$-axis, giving coordinates $(x,y)$.

In polar coordinates, a point $P$ is described by specifying a distance $r$, which is the distance from the origin to the point $P$, and an angle $\theta$, measured anti-clockwise from the positive $x$-axis to the line joining the origin to $P$, giving coordinates $(r,\theta)$.
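As a concrete illustration (this example is not in the original), consider the point one unit along each axis:
\[
P = (1,1) \text{ in Cartesian coordinates}
\quad\Longleftrightarrow\quad
P = \left(\sqrt{2},\, \tfrac{\pi}{4}\right) \text{ in polar coordinates,}
\]
since the point lies at distance $\sqrt{1^2+1^2}=\sqrt{2}$ from the origin, at $45^\circ$ ($\tfrac{\pi}{4}$ radians) anti-clockwise from the positive $x$-axis.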


Converting from Cartesian to polar

Suppose we are given the Cartesian coordinates of a point $P=(x,y)$ and want to convert them to polar form. Since $r$ is the distance from the origin to the point $P$, Pythagoras' theorem gives $x^2+y^2=r^2$.

Taking the positive square root, $r$ is found by: \[r = \sqrt{x^2+y^2}\]
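The formula for $r$ can be sketched in Python; here `cartesian_to_polar_r` is a hypothetical helper name, not part of any standard library:

```python
import math

def cartesian_to_polar_r(x, y):
    """Distance r from the origin to the point (x, y), via Pythagoras."""
    # math.hypot computes sqrt(x**2 + y**2), matching r = sqrt(x^2 + y^2)
    return math.hypot(x, y)

print(cartesian_to_polar_r(3, 4))  # 3-4-5 right triangle, so r = 5.0
```

Using `math.hypot` rather than writing `math.sqrt(x**2 + y**2)` directly is a design choice: it computes the same quantity while avoiding overflow for very large inputs.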

$\theta$ is the angle measured anti-clockwise from the positive $x$-axis to the line joining the origin to $P$. To find $\theta$, use SOHCAHTOA: