Just curious, when weighing a coin for authenticity, what +/- range do you consider acceptable?
I assume we're talking about modern coins? Many ancient and medieval coins have a much wider tolerance.
I can't speak to ancients, but most medieval coins were held to stricter weight tolerances than we use today. A general rule of thumb for modern coins is 1%, but the larger the coin, the larger the tolerance; some run over 2%.
But the more valuable the coin, the more cautious the minters were likely to be. Weights were much more strictly controlled for Venetian ducats than for Sicilian follaros. Roman coins have a wider tolerance partly because of the mint structure: the aureus was struck at a fixed number to the pound of gold, so as long as 240 aurei weighed a pound, the individual weights could fluctuate a bit.
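To put rough numbers on that "struck to the pound" idea, here is a minimal sketch. The 240-to-the-pound count is the figure quoted above; the 327.45 g value for the Roman pound (libra) is a commonly cited modern estimate, not something from this thread, and the batch allowance is purely illustrative.

```python
# Sketch of weighing "by the pound" rather than coin by coin: individual coins
# may vary so long as the full count adds up to the standard pound.

ROMAN_POUND_G = 327.45    # commonly cited estimate for the Roman libra (assumption)
COINS_PER_POUND = 240     # count quoted in the post above

# Average weight each coin is aiming for
target_weight_g = ROMAN_POUND_G / COINS_PER_POUND   # about 1.36 g

def batch_on_standard(weights_g, pound_g=ROMAN_POUND_G, allowance_g=0.5):
    """True if the whole batch weighs a pound within a small allowance,
    even if individual coins drift from the per-coin average."""
    return abs(sum(weights_g) - pound_g) <= allowance_g

print(f"Per-coin target: {target_weight_g:.3f} g")
```

The point of the sketch is just that the control sits on the batch total, so any single coin can be a few percent heavy or light without the batch failing.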
Of course there will always be some tolerance in any coin once it's even slightly worn, and I'm sure the US Mint also allows some tolerance in the original production of coins. And don't forget that the scale you use may not be as accurate as advertised, either. Hmmm, starting to sound like a +/- of about 100%. :goofer: Just kidding, of course, but what accuracy in grams are you after? 0.1, 0.01, or 0.001 g?
Silver and gold tend to have very strict tolerances: around 1% for gold and 1 to 1.5% for silver when new, and a little more as they wear. US gold coins were supposed to be recalled and recoined once they were light by, I believe, 3% due to wear; the allowance for silver was probably a little more. Base metals were more lax, running around 4% when new.
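For anyone who wants to turn those rough percentages into a quick check, here is a small sketch. The 1% new tolerance and 3% wear limit are the recollections from the post above, not official Mint figures, and the example standard weight (about 33.436 g, roughly a US $20 double eagle) is just for illustration.

```python
# Illustrative weight check using the rough percentages mentioned above.
# None of these thresholds are official Mint specifications.

def check_weight(measured_g, standard_g, new_tol_pct=1.0, wear_limit_pct=3.0):
    """Classify a coin's weight relative to its standard weight.

    new_tol_pct    - +/- percentage allowed for a freshly struck coin
    wear_limit_pct - percentage light at which a worn coin would be
                     recalled/recoined (per the post above)
    """
    deviation_pct = (measured_g - standard_g) / standard_g * 100.0
    if abs(deviation_pct) <= new_tol_pct:
        return "within mint tolerance"
    if -wear_limit_pct <= deviation_pct < 0:
        return "light, plausibly just wear"
    return "out of range - worth a closer look"

# Hypothetical example: coin weighed at 33.15 g against a 33.436 g standard
print(check_weight(33.15, 33.436))   # about -0.86% -> within mint tolerance
```

A heavy coin never gets the wear excuse in this sketch, since wear only removes metal; anything overweight beyond the new-coin tolerance falls straight into the "closer look" bucket.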