Correct Answer - Option 2 : 1 and 3 only
Statements 1 and 3 are correct because accuracy is defined in terms of the limits of error.
Limiting error = \(\rm \frac{\delta A}{A} \times 100\), where \(\delta A\) is the maximum permissible (limiting) error and \(A\) is the nominal value of the quantity.
Accuracy is also defined in terms of the true value of the quantity being measured.
% Error = \(\rm \frac{A_m - A_T}{A_T}\times 100\), where \(A_m\) is the measured value and \(A_T\) is the true value.
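As a quick numerical illustration, the two formulas can be evaluated directly. This is a minimal sketch with assumed values (the nominal reading, guaranteed error limit, measured value, and true value below are hypothetical and not taken from the question):

```python
# Minimal sketch (assumed values): limiting error and % error w.r.t. true value.

def limiting_error_percent(delta_a, a_nominal):
    """Limiting error expressed as a percentage of the nominal value A."""
    return (delta_a / a_nominal) * 100

def percent_error_true(a_measured, a_true):
    """% error referred to the true value A_T of the measured quantity."""
    return (a_measured - a_true) / a_true * 100

# Hypothetical instrument: nominal reading A = 100 V, guaranteed limit of
# error delta_A = 2 V  ->  +/- 2 % limiting error.
print(limiting_error_percent(2.0, 100.0))   # 2.0

# Hypothetical measurement: A_m = 102 V against a true value A_T = 100 V
# -> 2 % error with respect to the true value.
print(percent_error_true(102.0, 100.0))     # 2.0
```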
Statement 2 is wrong: the accuracy at a single point cannot define the accuracy of the instrument in general, since accuracy also depends on the calibration of the scale.