Artificial precision


In numerical mathematics, artificial precision is a source of error that arises when a numerical or semantic value is expressed with more precision than the original measurement or user input provided. For example, a person enters their birthday as the date 1984-01-01, but it is stored in a database as the timestamp 1984-01-01T00:00:00Z, which introduces artificial precision in the hour, minute, and second of birth, and may even shift the date itself, depending on the person's time zone at birth. This is also an example of false precision, which is artificial precision specifically of numerical quantities or measures.
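The birthday example above can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed storage scheme: it assumes the input string and format are as in the example, and shows how naively converting a date-only input into a full timestamp invents time-of-day precision that the user never supplied.

```python
from datetime import datetime, timezone

# The user supplied only a calendar date, with day-level precision.
user_input = "1984-01-01"

# Storing it as a full UTC timestamp introduces artificial precision:
# an hour, minute, and second of birth that were never provided.
stored = datetime.strptime(user_input, "%Y-%m-%d").replace(tzinfo=timezone.utc)
print(stored.isoformat())  # 1984-01-01T00:00:00+00:00

# Keeping only the date preserves the precision actually given.
print(stored.date().isoformat())  # 1984-01-01
```

A date-only type (here, `datetime.date`) records exactly the precision the user provided, whereas the timestamp silently asserts a midnight-UTC birth time.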






This article uses material from the Wikipedia article Artificial_precision, and is written by contributors. Text is available under a CC BY-SA 4.0 International License; additional terms may apply. Images, videos and audio are available under their respective licenses.