Regardless of whether hadron or electron collider detectors are considered, the most precisely measured quantities are the directions of muons, followed by the directions of electrons and, later, by lepton energies.
That is why precision requirements are defined, on the one side, by the size of alpha_QED and the necessity to define objects such as intermediate bosons, and, on the other, by detector details: granularity and background-subtraction requirements.
Three levels of ambiguity can be discussed: no better than 1 %, around 0.3 %, and finally better than 0.1 %. They require different levels of sophistication in field-theory-based theoretical predictions to complete tests of the Standard Model. Agreement means confirmation, while discrepancy points to New Physics (or to faulty calculations).
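The connection between the size of alpha_QED and these precision tiers can be illustrated with simple arithmetic. The sketch below is only indicative: the value of alpha and the generic one-loop scale alpha/pi are standard, but the mapping of loop orders onto the three tiers is an assumption for illustration, not a statement about any specific observable.

```python
import math

# Fine-structure constant in the Thomson limit (CODATA value).
alpha = 1.0 / 137.035999

# Generic size of a one-loop QED correction is O(alpha/pi),
# a two-loop one is O((alpha/pi)^2); actual corrections carry
# observable-dependent coefficients and logarithmic enhancements.
one_loop = alpha / math.pi      # roughly 0.23 %
two_loop = one_loop ** 2        # roughly 0.0005 %

for name, tier in [("1 %", 0.01), ("0.3 %", 0.003), ("0.1 %", 0.001)]:
    # If the target ambiguity is below the generic one-loop scale,
    # higher-order corrections can no longer be neglected.
    beyond_one_loop = tier < one_loop
    print(f"tier {name}: O(alpha/pi) = {one_loop:.4%}, "
          f"beyond-one-loop terms relevant: {beyond_one_loop}")
```

Already at this crude level one sees why the three tiers demand different theoretical sophistication: the 0.1 % tier sits below the generic one-loop scale of about 0.23 %.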
As important experience, I will point to the precision measurements of the LEP 1 era, where a precision of 0.043 % was reached in theory-experiment comparisons.
I will discuss what this experience may hint at for the future, where, for example, FCC measurements with ambiguities of 0.01 % or even better are expected. In particular, I will address what this will require of theory predictions: of Monte Carlo programs, and of predictions for idealized/optimal observables, possibly to be used in fits or in ML applications.