
Wednesday, December 28, 2022

On Speculative Execution

 Out of order, and dynamic. 

What Is Speculative Execution?

By Joel Hruska on June 10, 2022

With a new Apple security flaw in the news, it’s a good time to revisit the question of what speculative execution is and how it works. This topic received a great deal of discussion a few years ago when Spectre and Meltdown were frequently in the news and new side-channel attacks were popping up every few months.

Speculative execution is a technique used to increase the performance of all modern microprocessors to one degree or another, including chips built or designed by AMD, ARM, IBM, and Intel. The modern CPU cores that don’t use speculative execution are all intended for ultra-low power environments or minimal processing tasks. Various security flaws like Spectre, Meltdown, Foreshadow, and MDS all targeted speculative execution a few years ago, typically on Intel CPUs.
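To see why those flaws target speculation, it helps to look at the kind of code pattern they abuse. The C sketch below follows the bounds-check-bypass pattern described in the original Spectre paper (its "variant 1"); the array names and sizes are illustrative assumptions on my part, not code from this article.

/* Illustrative sketch of a Spectre "variant 1" style gadget.
 * The names array1, array2, and array1_size follow the Spectre
 * paper's example and are assumptions here, not from the article. */
#include <stdint.h>
#include <stddef.h>

uint8_t array1[16];
uint8_t array2[256 * 512];
size_t  array1_size = 16;

uint8_t victim_function(size_t x) {
    /* If the branch predictor has been trained to expect "x is in
     * bounds", the CPU may speculatively execute the body even when
     * x is out of bounds. The out-of-bounds load then leaves a
     * secret-dependent trace in the cache (which line of array2 was
     * touched), even though the architectural result is discarded. */
    if (x < array1_size) {
        return array2[array1[x] * 512];
    }
    return 0;
}

The point is that nothing here is a bug in the program itself; the information leak comes from work the processor did speculatively and then threw away.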

What Is Speculative Execution?

Speculative execution is one of three components of out-of-order execution, also known as dynamic execution. Along with multiple branch prediction (used to predict the instructions most likely to be needed in the near future) and dataflow analysis (used to schedule instructions for execution in an optimal order rather than the order they arrived in), speculative execution delivered a dramatic performance improvement over previous Intel processors when first introduced in the mid-1990s. Because these techniques worked so well, they were quickly adopted by AMD, which used out-of-order processing beginning with the K5.
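To make the payoff concrete, here is a minimal C sketch of my own (not from the article) built around a single data-dependent branch. How fast the loop runs depends almost entirely on how well the branch predictor guesses and how much of the speculatively executed work the core gets to keep.

/* Sketch of a branch whose cost depends on branch prediction and
 * speculative execution. The array size and pass count are
 * illustrative assumptions, not figures from the article. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20)

int main(void) {
    static int data[N];
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    /* Optionally sort data[] here (e.g. with qsort): with sorted input
     * the branch below becomes highly predictable, so the CPU can
     * speculate past it almost for free; with random input it
     * mispredicts roughly half the time and must discard the
     * speculated work, which shows up as a large slowdown. */

    clock_t start = clock();
    long long sum = 0;
    for (int pass = 0; pass < 100; pass++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)   /* data-dependent branch */
                sum += data[i];
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    printf("sum=%lld, time=%.2fs\n", sum, secs);
    return 0;
}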

ARM’s focus on low-power mobile processors initially kept it out of the OoOE playing field, but the company adopted out-of-order execution when it built the Cortex-A9 and has continued to expand its use of the technique with later, more powerful Cortex-branded CPUs.

Here’s how it works. Modern CPUs are all pipelined, which means they’re capable of executing multiple instructions in parallel, as shown in the diagram below. ...
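The article’s pipeline diagram isn’t reproduced in this excerpt, but the idea can be sketched in code. The two functions below are my own illustration, not from the article: one is a dependent chain that forces the pipeline to wait on each result, the other is a set of independent operations that a pipelined, out-of-order core can keep in flight at the same time.

/* Rough stand-in for a pipeline diagram (illustrative only). */

/* Dependent chain: each multiply needs the previous result, so the
 * operations cannot overlap and throughput is limited by latency. */
double dependent_chain(double x) {
    double a = x * 1.1;
    double b = a * 1.2;   /* waits for a */
    double c = b * 1.3;   /* waits for b */
    return c * 1.4;       /* waits for c */
}

/* Independent operations: nothing depends on another result until the
 * final sum, so a pipelined core can overlap all four multiplies. */
double independent_ops(double x) {
    double a = x * 1.1;
    double b = x * 1.2;
    double c = x * 1.3;
    double d = x * 1.4;
    return a + b + c + d;
}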
