Fractional measurements are better than decimal measurements whenever the level of precision matters, because a fraction's denominator can record exactly how precise the measurement is.
Decimal measurements can only increase or decrease precision by factors of 10.
For example, if your measurement is accurate to 1/4 of a unit, a fraction represents that precision directly.
What is that in decimal? "0.25" implies precision to the hundredth of a unit, which overstates it.
What if your measurement is half a unit, but it's precise to 1/64 of a unit? Just don't reduce the fraction: "32/64" carries more precision information than "0.5".
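The idea above can be sketched in code. Note that Python's standard `fractions.Fraction` automatically reduces (`Fraction(32, 64)` becomes `1/2`), so to keep the denominator as a record of precision you'd need something like this hypothetical `Measurement` class, a minimal sketch of the idea rather than any standard library:

```python
class Measurement:
    """An unreduced fraction whose denominator records measurement precision."""

    def __init__(self, numerator, denominator):
        self.numerator = numerator
        self.denominator = denominator  # measurement is precise to 1/denominator of a unit

    def value(self):
        # The numeric value of the measurement.
        return self.numerator / self.denominator

    def precision(self):
        # The smallest distinguishable step, as a fraction of a unit.
        return 1 / self.denominator

    def __repr__(self):
        return f"{self.numerator}/{self.denominator}"

# Both measurements have the same value, but record different precision:
half = Measurement(1, 2)    # half a unit, precise to 1/2 of a unit
fine = Measurement(32, 64)  # half a unit, precise to 1/64 of a unit
print(half.value(), fine.value())          # 0.5 0.5
print(half.precision(), fine.precision())  # 0.5 0.015625
```

Reducing `32/64` to `1/2` would produce the same value but silently discard the precision information, which is exactly the point.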