Intel Patents Dynamic Core Swapping in Multi-core Systems

According to the document published on April 10, the patent covers at least two cores with different performance levels in a single processor, in which the first core can hand off certain functions to the second core. The second core is powered up only when it is needed, and is described as consuming substantially less power than the first core when performing a particular function. The idea sounds similar to what hybrid notebooks take advantage of, and especially to what Nvidia is doing with Tegra 3 and its 4+1 architecture, where a fifth core runs certain features in a very low power state.

Intel filed the patent in December 2008. It explicitly describes a power-saving feature enabled by low-power cores that allow the circuitry to power down much more power-hungry cores.

In Intel's description, "in [a] mobile platform, a high performance core is desired for operation when the platform is connected to a fixed infrastructure power network, such as when the mobile platform is docked to a desktop personal computer (PC) for data synchronization. However, when it is used in a battery mode, a low performance with low power consumption is preferred. A fixed microprocessor core with fixed performance characteristics may not be able to accommodate different usage conditions."

The inventors explain that a traditional homogeneous processor core architecture may not be able to achieve the best possible power savings, as its cores are typically designed for high clock speeds and include, for example, many more transistors than a low-power core needs to perform basic functions.

The patent granted to Intel addresses this problem by building different types of (x86) cores into the same processor.
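The migration idea described above can be sketched in a few lines of code. This is purely illustrative and not Intel's implementation: the core names, performance ratios, and power figures are invented assumptions chosen to mirror the docked-versus-battery scenario the patent describes.

```python
# Hypothetical sketch of heterogeneous-core task routing, loosely
# modeled on the behavior described in the patent. All numbers and
# names here are invented for illustration.
from dataclasses import dataclass


@dataclass
class Core:
    name: str
    relative_performance: float  # throughput relative to the big core
    power_watts: float           # rough active power draw (assumed)


BIG = Core("high-performance", 1.0, 15.0)
LITTLE = Core("low-power", 0.3, 1.5)


def pick_core(on_battery: bool) -> Core:
    """Route work to the low-power core on battery; use the
    high-performance core only when fixed infrastructure power
    (e.g. a dock) is available."""
    return LITTLE if on_battery else BIG


def run_task(work_units: float, on_battery: bool) -> dict:
    """Estimate runtime and energy for a task on the chosen core."""
    core = pick_core(on_battery)
    seconds = work_units / core.relative_performance
    return {"core": core.name, "energy_joules": seconds * core.power_watts}


docked = run_task(10.0, on_battery=False)  # fast, but power-hungry
mobile = run_task(10.0, on_battery=True)   # slower, far less energy
```

Under these assumed figures, the low-power core finishes the task more slowly but at a fraction of the energy cost, which is the trade-off the patent aims to exploit.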

  • bemused_fred
    Wow, this is neat! Looking forward to seeing this in future computers.
  • mi1ez
    Oh dear Nvidia. I can see where this is going...