# Milliseconds to Microseconds (ms to µs)

Source: https://www.unitconvertercalculator.com/time/milliseconds-to-microseconds/

**1 ms = 1,000 µs**

One millisecond equals exactly 1,000 microseconds. To convert milliseconds to microseconds, multiply by 1,000. This direction of conversion is most useful when you need to express a human-scale or software-scale timing value in the finer hardware units that underlie it.

This conversion comes up frequently when moving from high-level specifications down to hardware requirements. A system designer might specify that a sensor must respond within 2 ms, then need to express that as 2,000 µs to compare it against the microsecond-level timing constraints of the underlying hardware interface. Similarly, a network administrator who receives a latency budget in milliseconds must convert it to microseconds when configuring hardware queuing rules or timestamping systems.

In audio production, this conversion is particularly common. A pre-delay of 1.2 ms needs to be expressed as 1,200 µs when setting hardware delay buffer sizes in audio interfaces that operate at the sample level. At a 48 kHz sample rate, one sample lasts about 20.8 µs, so a 1.2 ms delay corresponds to 57.6 samples, and getting that count right requires working in microseconds.
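
The sample-count arithmetic above can be sketched in Python (the 48 kHz rate and the helper name are illustrative assumptions, not part of any particular audio API):

```python
def ms_to_samples(delay_ms: float, rate_hz: int = 48_000) -> float:
    """Convert a delay in milliseconds to a sample count, via microseconds."""
    delay_us = delay_ms * 1_000            # 1.2 ms -> 1,200 µs
    return delay_us * rate_hz / 1_000_000  # samples = µs × (samples/s) ÷ (µs/s)

print(round(ms_to_samples(1.2), 1))  # 57.6 -- round up to 58 when sizing a buffer
```

Rounding up, not down, is the safe choice when sizing a buffer: a buffer one sample too short truncates the delay, while one sample too long merely wastes a few bytes.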

In scientific instrumentation, timing pulses specified in milliseconds by a protocol must be translated into microsecond-precision trigger settings on oscilloscopes and pulse generators. The conversion is trivial arithmetically but essential for correctness.

## Formula

Multiply the millisecond value by 1,000:

**µs = ms × 1,000**
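
As a minimal sketch in Python (the function name is illustrative):

```python
def ms_to_us(milliseconds: float) -> float:
    """Convert milliseconds to microseconds (1 ms = 1,000 µs)."""
    return milliseconds * 1_000

print(ms_to_us(2))    # 2000.0
print(ms_to_us(3.5))  # 3500.0
```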

## Conversion Table

| Milliseconds (ms) | Microseconds (µs) |
|---|---|
| 0.001 ms | 1 µs |
| 0.01 ms | 10 µs |
| 0.1 ms | 100 µs |
| 0.5 ms | 500 µs |
| 1 ms | 1,000 µs |
| 5 ms | 5,000 µs |
| 10 ms | 10,000 µs |
| 16.667 ms | 16,667 µs |
| 50 ms | 50,000 µs |
| 100 ms | 100,000 µs |
| 250 ms | 250,000 µs |
| 500 ms | 500,000 µs |
| 1,000 ms | 1,000,000 µs |
| 5,000 ms | 5,000,000 µs |
| 10,000 ms | 10,000,000 µs |

## Units

### Millisecond (ms)

One thousandth of a second. The standard unit for measuring human reaction times, network latency, audio processing, and sports timing.

### Microsecond (µs)

One millionth of a second. Used in electronics, radar, radio transmission, and scientific instrumentation where milliseconds are too coarse.

## Background

Converting milliseconds to microseconds is a routine step in hardware bring-up and system integration work. When a software engineer hands a hardware engineer a timing specification in milliseconds, the hardware engineer will almost always convert it to microseconds — or even nanoseconds — before translating it into register values, clock divider settings, or timer configurations.

In the world of real-time operating systems (RTOS), task scheduling is specified in milliseconds at the application level, but the underlying tick timer often runs at microsecond or even nanosecond resolution. A task specified to run every 10 ms translates to a timer interrupt every 10,000 µs, which the kernel schedules by counting microsecond-level ticks.
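
That translation can be sketched as follows, assuming a hypothetical 1 MHz tick source (one tick per microsecond); real RTOS timer configuration depends on the specific kernel and clock tree:

```python
def period_ms_to_ticks(period_ms: float, timer_hz: int = 1_000_000) -> int:
    """Translate a task period in ms into timer ticks.

    timer_hz = 1_000_000 is an assumed 1 MHz tick source: one tick per µs.
    """
    period_us = period_ms * 1_000                  # 10 ms -> 10,000 µs
    return int(period_us * timer_hz // 1_000_000)  # at 1 MHz, ticks == µs

print(period_ms_to_ticks(10))  # 10000
```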

In telecommunications, timing tolerances for synchronization protocols such as IEEE 1588 Precision Time Protocol (PTP) are often specified in milliseconds at the system level but must be maintained at the microsecond level by the hardware clocks. A 1 ms synchronization window becomes a 1,000 µs hardware constraint that the PHY chip and timestamp unit must meet.
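
A hedged sketch of such a budget check (the function name and the 1 ms default are illustrative, not part of the PTP standard):

```python
def within_sync_budget(offset_us: float, budget_ms: float = 1.0) -> bool:
    """Check a measured clock offset (in µs) against a budget given in ms."""
    budget_us = budget_ms * 1_000  # 1 ms -> 1,000 µs
    return abs(offset_us) <= budget_us

print(within_sync_budget(850))    # True: inside the 1,000 µs window
print(within_sync_budget(-1200))  # False: 1,200 µs exceeds the budget
```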

For photographers and videographers, flash synchronization timing is specified in milliseconds (a shutter sync speed of 1/250 s is 4 ms), but the actual flash duration and delay are measured in microseconds. Understanding both scales and the conversion between them is part of mastering high-speed photography.

## Good to Know

The millisecond is the smallest unit of time that most people ever encounter in daily life — in ping displays, audio latency settings, and sports timing. Converting it to microseconds reveals the invisible hardware world beneath these familiar numbers. A 16 ms frame budget sounds manageable; 16,000 µs sounds enormous for a CPU — yet both describe the same constraint.

## FAQ

### How many microseconds are in a millisecond?

There are exactly 1,000 microseconds in one millisecond. Since the millisecond is 10⁻³ seconds and the microsecond is 10⁻⁶ seconds, converting from milliseconds to microseconds always involves multiplying by 1,000.

### How do I convert milliseconds to microseconds?

Multiply the number of milliseconds by 1,000. For example, 3.5 ms × 1,000 = 3,500 µs. For 0.1 ms, the result is 100 µs.

### When should I express a value in microseconds instead of milliseconds?

Use microseconds when the value is smaller than about 0.5 ms, when you are interfacing with hardware that operates at the microsecond level, or when decimal precision in milliseconds would be unwieldy. For example, 0.025 ms is cleaner as 25 µs, and hardware datasheets almost always use microseconds for sub-millisecond timing.

## Non-Frequently Asked Questions

### How many microseconds does it take to say the word 'microsecond'?

Saying 'microsecond' takes about 0.4 to 0.5 seconds for most speakers, which is 400,000 to 500,000 microseconds. In that time, you have spoken one word and a modern CPU has executed roughly 1.2 to 1.5 billion instructions. The word is not pulling its weight.

### If a tortoise races a 1-millisecond event, who wins?

The tortoise moves at about 0.27 m/s, so in 1 ms (1,000 µs) it covers about 0.27 mm, roughly a quarter of a millimeter. The tortoise does not win, but it does get participation points for existing at a timescale that is at least not completely imperceptible.

### Is there anything that takes exactly 1 millisecond in nature?

Not really: nature does not schedule events in round SI units. However, the period of a 1,000 Hz sound wave is exactly 1 ms. Concert A is 440 Hz (period: 2.27 ms). The lowest note on a standard piano, A0, is 27.5 Hz, giving a period of about 36,364 µs or 36.36 ms. None of these are exactly 1 ms, which is nature's quiet commentary on the arbitrariness of human unit systems.
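
The period arithmetic in this answer is just 1,000,000 divided by the frequency in hertz; a quick Python check:

```python
def period_us(freq_hz: float) -> float:
    """Period of a tone in microseconds: 1,000,000 / frequency in Hz."""
    return 1_000_000 / freq_hz

print(f"{period_us(1000):.1f}")  # 1000.0 µs -> exactly 1 ms
print(f"{period_us(440):.1f}")   # 2272.7 µs -> about 2.27 ms
print(f"{period_us(27.5):.1f}")  # 36363.6 µs -> about 36.36 ms
```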

## Related Articles

- [Why We Measure: The Deepest Urge in Human Civilisation](https://www.unitconvertercalculator.com/blog/why-we-measure)
- [How We Invented Time: The Strange History of Seconds, Minutes and Hours](https://www.unitconvertercalculator.com/blog/how-we-invented-time)

## See Also

- [Microseconds to Milliseconds](https://www.unitconvertercalculator.com/time/microseconds-to-milliseconds/)
