I think it's important to understand that the metric system is in some ways superior to the English system of measurement, but in most ways equal. That the US is stuck on the English system is an accident of history.
During the Middle Ages and earlier, every region had its own set of measurements, all a little distinct, and in most cases the units didn't relate well to each other. France and the rest of Europe pretty much stuck with the old ways--there were at least a dozen distinct units called a foot (pied) in France at the time of the revolution, all a little different from each other and none relating particularly to longer or shorter units. Britain had the same problem, but in the late 1500s Parliament decided to rationalize it. They determined, by statute, that a foot was 12 inches, a yard was 3 feet, a chain was 22 yards, a furlong was 10 chains, a mile was 8 furlongs, an acre was 10 square chains (a furlong by a chain), and so forth. This happened to be at the start of the great period of British colonization, and as a consequence these statute measurements found close to universal acceptance in the largely British American colonies. They made far greater sense than the mess in the rest of the world.
One of the fads at the time of the revolutions was decimalization. When Hamilton and others were deciding on a currency for their new country, they embraced the new idea and went with the decimal dollar. The English system of money at the time was a mix of the sensible and the less so: the pound was 240 pence, grouped into 20 shillings of 12 pence each, but also split into odder units like the guinea (21 shillings). Say what you will about it, though: 240 divides evenly by 3, which is not true of any power of 10.
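If you want to see that divisibility argument laid out, here's a quick Python sketch (my own illustration, nothing historical about it) comparing the divisors of 240 pence and 100 cents:

```python
def divisors(n):
    """Return every positive integer that divides n evenly."""
    return [d for d in range(1, n + 1) if n % d == 0]

# 240 pence to the pound: splits cleanly into halves, thirds, quarters,
# fifths, sixths, eighths, tenths, twelfths...
print(divisors(240))
# [1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 16, 20, 24, 30, 40, 48, 60, 80, 120, 240]

# 100 cents to the dollar: no clean third (100 / 3 = 33.33...).
print(divisors(100))
# [1, 2, 4, 5, 10, 20, 25, 50, 100]
```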
During their revolution, the French decided, finally, to rationalize their system, and they based it on the Earth itself: the meter was defined so that there were exactly 10,000,000 meters between the equator and the North Pole along the meridian running through Paris. Jefferson, then Secretary of State, thought this was a good idea and got the French to send a standard meter and kilogram to the United States. Unfortunately, the ship was captured by privateers and the standards never arrived. Rather than cook up their own, possibly different versions, the Americans stuck with feet, inches, and pounds until another set of standards could be sent from France. Unfortunately, the French government never got its act together to send another.
At the same time, something really amazing was happening--the industrial revolution. It started in Britain but soon spread to America. All of a sudden, standardized parts were all the rage, especially nuts and bolts, because standardization made it possible to outsource large parts of a manufacturing process, and suddenly manufacturing was a Big Deal. The US standard, which has come to be called SAE (after the Society of Automotive Engineers), measured things in fractional inches and threads per inch, but it did something very special: it established a rating system for the strength of these components. An American engineer was unlikely to specify a metric (or Whitworth) bolt because it was harder to be confident of its strength.
I am an American trained as a scientist, and I am perfectly comfortable with either system. Most of the time it really doesn't matter: 1/4-20 is pretty close to M6x1.0, and as long as I know which one I'm using, the difference is unimportant. There are a few cases where it does matter. 3/4 inch is a standard width for a lot of things, such as electrical and masking tape. But when somebody specifies 20mm, as they often do in Europe, it's a bloody pain to find in America. 3/4" is 19.05mm. Close, but no cigar. I just needed to get some hold-downs for my workbench: American dog holes are 3/4", European dog holes are 20mm, and one will not work in the other.
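If you want to check those numbers, here's a small Python sketch (my own, purely illustrative) of the conversions involved:

```python
MM_PER_INCH = 25.4  # exact: the inch has been defined as 25.4 mm since 1959

def tpi_to_pitch_mm(threads_per_inch):
    """Convert an imperial thread count (TPI) to a metric pitch in mm."""
    return MM_PER_INCH / threads_per_inch

# 1/4-20 vs. M6x1.0: close, but not interchangeable.
print(f"1/4-20 diameter: {0.25 * MM_PER_INCH:.2f} mm (M6 is 6.00 mm)")         # 6.35 mm
print(f"1/4-20 pitch:    {tpi_to_pitch_mm(20):.2f} mm (M6 coarse is 1.00 mm)")  # 1.27 mm

# 3/4" vs. 20 mm dog holes: a 19.05 mm dog rattles in a 20 mm hole,
# and a 20 mm dog won't go into a 19.05 mm hole at all.
print(f'3/4" in mm: {0.75 * MM_PER_INCH:.2f}')  # 19.05
```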