
Milliseconds to Microseconds (ms to µs) Converter

1 ms = 1,000 µs


One millisecond equals exactly 1,000 microseconds. To convert milliseconds to microseconds, multiply by 1,000. This direction of conversion is most useful when you need to express a human-scale or software-scale timing value in the finer hardware units that underlie it.

The conversion comes up frequently when moving from high-level specifications down to hardware requirements. A system designer might specify that a sensor must respond within 2 ms, then need to express that as 2,000 µs to compare it against the microsecond-level timing constraints of the underlying hardware interface. Similarly, a network administrator who receives a latency budget in milliseconds must convert it to microseconds when configuring hardware queuing rules or timestamping systems.

In audio production, this conversion is particularly common. A short delay of 1.2 ms needs to be expressed as 1,200 µs when setting hardware delay buffer sizes in audio interfaces that operate at the sample level. At a 48 kHz sample rate, one sample lasts about 20.8 µs, so a 1.2 ms delay corresponds to approximately 57.6 samples, and getting that count right requires working in microseconds. In scientific instrumentation, timing pulses specified in milliseconds by a protocol must be translated into microsecond-precision trigger settings on oscilloscopes and pulse generators. The conversion is trivial arithmetically but essential for correctness.
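As a quick sketch of that audio arithmetic (Python; the variable names are illustrative, and the 48 kHz rate is the one from the example above):

    # Convert a millisecond delay to microseconds, then to samples
    # at a 48 kHz sample rate (figures from the audio example above).
    SAMPLE_RATE_HZ = 48_000

    delay_ms = 1.2
    delay_us = delay_ms * 1_000                 # 1.2 ms -> 1200 µs
    us_per_sample = 1_000_000 / SAMPLE_RATE_HZ  # ~20.833 µs per sample
    delay_samples = delay_us / us_per_sample    # ~57.6 samples

    print(f"{delay_ms} ms = {delay_us:.0f} µs ≈ {delay_samples:.1f} samples")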

How to Convert Milliseconds to Microseconds

µs = ms × 1,000
Multiply the value in Milliseconds by 1,000
  1. Take your value in Milliseconds
  2. Multiply by 1,000
  3. Read the result in Microseconds
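In code, the whole procedure is the single multiplication above. A minimal Python sketch (the helper name ms_to_us is illustrative, not from any library):

    def ms_to_us(ms: float) -> float:
        """Convert milliseconds to microseconds: multiply by 1,000."""
        return ms * 1_000

    print(ms_to_us(2.0))  # 2000.0
    print(ms_to_us(0.5))  # 500.0
    print(ms_to_us(3.5))  # 3500.0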

Common Milliseconds to Microseconds Conversions

Milliseconds (ms) Microseconds (µs)
0.001 ms 1 µs
0.01 ms 10 µs
0.1 ms 100 µs
0.5 ms 500 µs
1 ms 1,000 µs
5 ms 5,000 µs
10 ms 10,000 µs
16.667 ms 16,667 µs
50 ms 50,000 µs
100 ms 100,000 µs
250 ms 250,000 µs
500 ms 500,000 µs
1,000 ms 1,000,000 µs
5,000 ms 5,000,000 µs
10,000 ms 10,000,000 µs
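The table can be reproduced with a short loop; a Python sketch, where the formatting simply rounds away floating-point noise:

    # Print the ms -> µs table above; 1 ms = 1,000 µs by definition.
    values_ms = [0.001, 0.01, 0.1, 0.5, 1, 5, 10, 16.667,
                 50, 100, 250, 500, 1_000, 5_000, 10_000]
    for ms in values_ms:
        print(f"{ms} ms = {ms * 1_000:,.0f} µs")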

Good to Know About Milliseconds to Microseconds Conversion

The millisecond is the smallest unit of time that most people ever encounter in daily life — in ping displays, audio latency settings, and sports timing. Converting it to microseconds reveals the invisible hardware world beneath these familiar numbers. A 16 ms frame budget sounds manageable; 16,000 µs sounds enormous for a CPU — yet both describe the same constraint.

Milliseconds to Microseconds: What You Need to Know

Converting milliseconds to microseconds is a routine step in hardware bring-up and system integration work. When a software engineer hands a hardware engineer a timing specification in milliseconds, the hardware engineer will almost always convert it to microseconds, or even nanoseconds, before translating it into register values, clock divider settings, or timer configurations.

In the world of real-time operating systems (RTOS), task scheduling is specified in milliseconds at the application level, but the underlying tick timer runs in microseconds or even nanoseconds. A task specified to run every 10 ms translates to a timer interrupt every 10,000 µs, which the kernel schedules by counting microsecond-level ticks.

In telecommunications, timing tolerances for synchronization protocols such as IEEE 1588 Precision Time Protocol (PTP) are often specified in milliseconds at the system level but must be maintained at the microsecond level by the hardware clocks. A 1 ms synchronization window becomes a 1,000 µs hardware constraint that the PHY chip and timestamp unit must meet.

For photographers and videographers, flash synchronization timing is specified in milliseconds (a shutter sync speed of 1/250 s is 4 ms), but the actual flash duration and delay are measured in microseconds. Understanding both scales and the conversion between them is part of mastering high-speed photography.
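As a rough sketch of the RTOS arithmetic described above (Python; the 1 µs tick granularity is an assumed value for illustration, not taken from any particular kernel):

    # Turn a task period given in milliseconds into timer ticks,
    # assuming a hypothetical 1 µs tick period (real kernels vary).
    TICK_PERIOD_US = 1

    def period_ms_to_ticks(period_ms: float) -> int:
        period_us = period_ms * 1_000  # ms -> µs
        return round(period_us / TICK_PERIOD_US)

    print(period_ms_to_ticks(10))  # 10000 ticks for a 10 ms task period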

What is a Millisecond? (ms)

One thousandth of a second. The standard unit for measuring human reaction times, network latency, audio processing, and sports timing.

Tags: Metric (SI), network latency (ping), sports timing, audio and video production
Learn more about Millisecond →

What is a Microsecond? (µs)

One millionth of a second. Used in electronics, radar, radio transmission, and scientific instrumentation where milliseconds are too coarse.

Tags: Metric (SI), radar pulse timing, radio wave transmission, CPU cache latency
Learn more about Microsecond →

Going the other way? Use our Microseconds to Milliseconds converter.

Milliseconds to Microseconds FAQ

  • How many microseconds are in a millisecond? There are exactly 1,000 microseconds in one millisecond. Since the millisecond is 10⁻³ seconds and the microsecond is 10⁻⁶ seconds, converting from milliseconds to microseconds always involves multiplying by 1,000.

  • How do you convert milliseconds to microseconds? Multiply the number of milliseconds by 1,000. For example, 3.5 ms × 1,000 = 3,500 µs. For 0.1 ms, the result is 100 µs.

  • When should you use microseconds instead of milliseconds? Use microseconds when the value is smaller than about 0.5 ms, when you are interfacing with hardware that operates at the microsecond level, or when decimal precision in milliseconds would be unwieldy. For example, 0.025 ms is cleaner as 25 µs, and hardware datasheets almost always use microseconds for sub-millisecond timing.

Non-Frequently Asked Questions About Milliseconds to Microseconds

Questions nobody should ask, but someone did.

  • How long does it take to say 'microsecond'? About 0.4 to 0.5 seconds for most speakers, which is 400,000 to 500,000 microseconds. In that time, you have spoken one word and a modern CPU has executed roughly 1.2 to 1.5 billion instructions. The word is not pulling its weight.

  • How far does a tortoise move in a millisecond? The tortoise moves at about 0.27 m/s, so in 1 ms (1,000 µs) it covers about 0.27 mm, roughly a quarter of a millimeter. The tortoise does not win, but it does get participation points for moving a distance that is at least not completely imperceptible.

  • Does anything in nature last exactly one millisecond? Not really; nature does not schedule events in round SI units. The period of a 1,000 Hz sound wave is exactly 1 ms, but that frequency is itself a human-chosen round number. Concert A is 440 Hz (period: about 2.27 ms), and the lowest note on a standard piano, A0, is 27.5 Hz, with a period of about 36,364 µs or 36.36 ms. Neither musical note lands exactly on 1 ms, which is nature's quiet commentary on the arbitrariness of human unit systems.
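The period figures in the last answer are just T = 1/f converted into milliseconds and microseconds; a tiny Python sketch:

    # Period of a wave, T = 1/f, expressed in ms and µs.
    for freq_hz in (1_000, 440, 27.5):
        period_s = 1 / freq_hz
        print(f"{freq_hz} Hz -> {period_s * 1_000:.2f} ms "
              f"= {period_s * 1_000_000:,.0f} µs")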

Need the reverse? Use our Microseconds to Milliseconds converter. See all Time converters.