The setpoint is essentially a measure of the force applied by the tip to the sample. In contact mode, it is a target deflection of the cantilever; the feedback electronics maintain this deflection, so that the force between tip and sample is kept constant. In tapping mode, it is a target amplitude (the amplitude of oscillation of the cantilever), which controls the force with which the tip taps on the sample. Again, the feedback electronics maintain the set amplitude.
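The feedback idea above can be sketched in a few lines. This is a toy model only, not a real AFM controller API: deflection is modelled as piezo extension minus local surface height, and the names `feedback_scan`, `gain`, and `steps_per_pixel` are illustrative.

```python
# Minimal sketch of feedback holding a contact-mode setpoint, under a toy
# model: cantilever deflection = piezo extension z minus local surface height.
# All names here are illustrative, not any real instrument's API.

def feedback_scan(setpoint, surface_heights, gain=0.3, steps_per_pixel=50):
    """Adjust z at each pixel so the deflection settles at the setpoint.

    The recorded z values are effectively the topography image: once the
    loop has settled, z = surface height + setpoint at every pixel.
    """
    z = 0.0
    topography = []
    for h in surface_heights:
        for _ in range(steps_per_pixel):
            deflection = z - h              # toy deflection model
            error = setpoint - deflection   # feedback error signal
            z += gain * error               # simple integral feedback
        topography.append(z)
    return topography
```

With a setpoint of 0.1 and surface heights [1.0, 2.0, 0.5], the recorded z values settle very close to 1.1, 2.1, and 0.6: the feedback tracks the surface while keeping the deflection (and hence the force) constant.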

Setpoint is expressed differently on different instruments, so it is very IMPORTANT that you check your instrument manual to find out how it works in your case. On some instruments a small setpoint means a low force applied to the sample, whereas on others a small setpoint means a large force. This apparent contradiction can even change from one mode to the other on the same system.

A large force applied to the sample often means better imaging, but it also means more wear on the tip and the sample, i.e. shorter tip life and less chance of completing a scan before the tip becomes contaminated or broken. So, generally, you should start with a "safe" setpoint (e.g. just touching the sample) and adjust it slowly until the imaging no longer improves, then stop.
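The "start safe, then adjust until imaging stops improving" procedure can be sketched as a simple loop. For this sketch we assume a larger setpoint means a larger force (instrument-dependent, as discussed above), and `measure_quality` is a hypothetical callback standing in for the operator judging the image:

```python
# Sketch of the "start safe, increase until no improvement" procedure.
# Assumption: larger setpoint = larger force (check your manual!).
# measure_quality is hypothetical: it stands in for the operator's own
# judgement of image quality at a given setpoint.

def optimise_setpoint(measure_quality, start, step, tol=0.01, max_steps=30):
    best_sp = start
    best_q = measure_quality(start)
    sp = start
    for _ in range(max_steps):
        sp += step                    # press a little harder
        q = measure_quality(sp)
        if q - best_q < tol:          # imaging stopped improving: stop here
            break
        best_sp, best_q = sp, q
    return best_sp
```

The loop stops at the first step that brings no real improvement, which matches the advice above: do not keep increasing the force past the point where the image gets better, since the extra force only wears the tip.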

The "best" setpoint can vary from tip to tip and from sample to sample; remember that there is no "golden number" for the setpoint. If someone tells you a certain value is ideal before you start imaging, take it with a pinch of salt, and instead optimise the value based on what you see. Having said all that, read your user manual: it may tell you the best initial value to use for your system.