Percentage error is calculated by taking the difference between a measured value and an accepted (true) value, dividing that difference by the accepted value, and multiplying by 100%. For example, if the measured molar mass of lithium is 8 g/mol and the accepted value is 6.94 g/mol, the difference is 1.06 g/mol, giving a percent error of +15.27%. When the true value is unknown, the estimated error can instead be divided by the measured value, which expresses the error relative to the measurement itself. This simple calculation is the standard way to quantify how far a measurement deviates from the accepted value.
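As a minimal sketch, the calculation can be expressed in Python; the function names here are illustrative, and the example reuses the lithium values from the text:

```python
def percent_error(measured: float, accepted: float) -> float:
    """Percent error relative to the accepted (true) value."""
    return (measured - accepted) / accepted * 100.0

def relative_error(estimated_error: float, measured: float) -> float:
    """Fallback when the true value is unknown:
    divide the estimated error by the measured value instead."""
    return estimated_error / measured * 100.0

# Lithium example: measured 8 g/mol vs. accepted 6.94 g/mol
print(f"{percent_error(8.0, 6.94):+.2f}%")  # prints +15.27%
```

Note that the sign is kept, so the result shows whether the measurement overshoots (+) or undershoots (−) the accepted value.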