
Originally Posted by RedlumX
Guys,
I think one shouldn't overestimate the achievable accuracy of a power meter. Even when factory new, the good ones from e.g. Newport are not accurate to better than a few percent, and to keep that accuracy the meter needs to be recalibrated every year or so. I have seen meters that were off by more than 20% even though their last calibration date was only a few years back. Hoping to do better than 10% with any sort of DIY sensor is unrealistic (unless you are very lucky). Just vary the beam size and position on the sensor and you will typically already see the reading change by a few percent, even with commercial sensors (I used to own one of those Ophirs as well). And thermal sensors are in any case very sensitive to background IR; just walking past the table the meter sits on can change the reading by a few mW.
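
To make this concrete, here is a minimal sketch of how independent error sources like the ones above stack up. The percentages are purely illustrative assumptions (not from any datasheet); the point is only that a handful of few-percent contributions already lands you near the 10% ballpark:

[code]
import math

# Illustrative uncertainty contributions for a DIY thermal sensor, in percent.
# These values are assumptions for the sake of the example, not measured data.
error_sources = {
    "reference/transfer calibration":   3.0,  # accuracy of the meter you copy from
    "drift since last calibration":     3.0,
    "beam size/position dependence":    3.0,  # spatial non-uniformity of the sensor
    "background IR / thermal drift":    2.0,
    "laser power fluctuation":          2.0,  # mode cycling etc. during calibration
}

# Independent errors combine in quadrature (root sum of squares).
total_rss = math.sqrt(sum(e**2 for e in error_sources.values()))

for name, e in error_sources.items():
    print(f"{name:34s} +/-{e:.1f} %")
print(f"{'combined (RSS)':34s} +/-{total_rss:.1f} %")
print(f"{'combined (worst case, linear sum)':34s} +/-{sum(error_sources.values()):.1f} %")
[/code]

With these assumed numbers the quadrature sum is about 6% and the worst-case linear sum about 13%, which is why a 10% target is a reasonable expectation for a home-built sensor.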
So all in all, there is no need to involve 16-bit ADCs and the like; that just creates a false impression of accuracy. If you want to be realistic, aim at 10% unless you really have a NIST-calibrated transfer standard and recalibrate regularly. Even transferring a calibration from sensor A to sensor B is tricky and easily costs you a few percent of accuracy unless you are extremely careful - it requires a very controlled environment and at least parallel monitoring of the laser power (recall that lasers often fluctuate in power due to mode cycling etc.).
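
A rough illustration of the resolution-versus-accuracy point (the full-scale range and the +/-5% sensor uncertainty below are assumed values, chosen only to show the ratio): even a modest ADC resolves far finer steps than the sensor's absolute accuracy can justify, so extra bits add digits, not accuracy.

[code]
# Resolution vs. accuracy: sketch with assumed numbers.
FULL_SCALE_MW = 1000.0    # assume a 0-1 W measurement range
SENSOR_UNCERTAINTY = 0.05 # assume +/-5 % overall sensor + calibration uncertainty

for bits in (10, 12, 16):
    lsb_mw = FULL_SCALE_MW / (2**bits)        # one ADC quantisation step
    lsb_pct = 100.0 * lsb_mw / FULL_SCALE_MW  # step as a fraction of full scale
    print(f"{bits:2d}-bit ADC: LSB = {lsb_mw:7.4f} mW ({lsb_pct:.4f} % of full scale)")

print(f"Sensor uncertainty at full scale: +/-{SENSOR_UNCERTAINTY * FULL_SCALE_MW:.0f} mW "
      f"(+/-{100 * SENSOR_UNCERTAINTY:.0f} %)")
# Even the 10-bit converter resolves ~0.1 % of full scale, roughly 50x finer
# than the +/-5 % the sensor itself can deliver.
[/code]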
There is a very good reason why commercial meters are so expensive - and it's not the electronics!