Intel Proposes x86S to Drop Backward Compatibility – Please Be True
22-05-2023 | By Robin Mitchell
A recent announcement from Intel could see newer processors drop backward compatibility with older 16-bit systems, helping to remove redundant operating modes and long-obsolete hardware. What challenges does backward compatibility introduce, what would the x86S proposal do, and why is this definitely a step in the right direction?
What challenges does backward compatibility introduce?
As an engineer, one fact about modern processors continues to grind my gears to this day: backward compatibility. Simply put, modern Intel x86/x64 processors integrate numerous backward-compatibility circuits and architectural concepts so that code written during the 16-bit era can still run.
For example, when a computer boots in legacy BIOS mode, the firmware loads only the first 512 bytes of the bootable medium and checks for a specific byte pattern (the 0xAA55 signature) at the end of that sector to decide whether the code can be booted. At the same time, modern processors still start in 16-bit mode at power-on before switching to 32-bit or 64-bit operation (although the exact path depends on whether a legacy or UEFI boot mode is being used).
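As a minimal sketch of what that legacy check involves, the C snippet below reads the first 512-byte sector of a disk image and verifies the 0xAA55 boot signature, which is essentially the test a legacy BIOS performs before handing control to the boot code (the file name disk.img is just a placeholder).

```c
/* boot_sig.c - check the legacy BIOS boot signature of a disk image.
 * A minimal sketch; "disk.img" is a placeholder path. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t sector[512];
    FILE *f = fopen("disk.img", "rb");
    if (!f || fread(sector, 1, sizeof sector, f) != sizeof sector) {
        fprintf(stderr, "could not read a full 512-byte boot sector\n");
        return 1;
    }
    fclose(f);

    /* A legacy BIOS only treats the sector as bootable if its last two
     * bytes are 0x55 0xAA (the 0xAA55 signature, stored little-endian). */
    if (sector[510] == 0x55 && sector[511] == 0xAA)
        printf("bootable: signature 0xAA55 present\n");
    else
        printf("not bootable: signature missing\n");
    return 0;
}
```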
Another example of backward compatibility still found in modern processors is the set of code and data segment registers. These registers were originally included so the processor could reach more memory than a 16-bit offset allows, by splitting memory into separate code, data, and stack segments, but their 16-bit size limits what they can do. In contrast, 64-bit architectures designed in the 21st century, such as 64-bit Arm, use a flat memory model and have no such registers.
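To make the legacy scheme concrete, here is a small sketch of how a 16-bit real-mode segment:offset pair is turned into a physical address: the segment is shifted left by four bits and added to the offset, which is why real mode can only reach just over 1 MB of memory.

```c
/* seg_offset.c - how 16-bit real mode forms a 20-bit physical address. */
#include <stdio.h>
#include <stdint.h>

/* In real mode the physical address is (segment << 4) + offset. */
static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* 0x07C0:0x0000 is the classic boot-sector load address, 0x7C00. */
    printf("0x07C0:0x0000 -> 0x%05X\n", real_mode_address(0x07C0, 0x0000));

    /* The highest reachable address, 0xFFFF:0xFFFF, is 0x10FFEF - barely
     * more than 1 MB, the hard limit of this addressing scheme. */
    printf("0xFFFF:0xFFFF -> 0x%05X\n", real_mode_address(0xFFFF, 0xFFFF));
    return 0;
}
```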
So, it is clear that there are numerous features in modern processors that were designed for a bygone era, but what challenges does retaining such hardware present?
Firstly, keeping such old hardware requirements can have a severe impact on a device's microarchitecture and design. For example, improvements to booting and code execution cannot be integrated if they would break backward compatibility.
Secondly, devices that require backward compatibility must devote silicon area to ancient hardware, area that could be better used for new hardware accelerators and execution units. The need to support legacy modes is also likely to have wider knock-on effects on the rest of the hardware in modern designs.
Thirdly, maintaining backward compatibility can also see security vulnerabilities persist in new designs. This is especially true during the boot process, where legacy systems may not employ strong security practices, leaving room for malicious code to execute. Trying to fix such issues could break backward compatibility, thereby introducing numerous design challenges for engineers.
But above all else, the single most important challenge introduced by backward compatibility is that it limits what hardware and software can do. As the nature of software and applications changes, the need for new hardware circuitry, accelerators, and even new microarchitectures becomes critical. Maintaining backward compatibility restricts what engineers can change in a processor, which can cause architectures to stagnate and become inefficient while trapping software developers on an ageing platform.
Let's consider a real-world example to illustrate this point. Imagine you're a software developer working on a cutting-edge application that leverages the full capabilities of modern 64-bit processors, yet the platform you target still carries the baggage of its 16-bit past. Accounting for that legacy not only limits the features and capabilities you can rely on but also adds development time and complexity. This is a common frustration faced by many developers in the field and a clear example of how backward compatibility can hinder progress and innovation.
Intel suggests that the proposed x86S could eliminate backward compatibility
Recently, Intel published a whitepaper describing how a new x86S architecture would help engineers and developers by eliminating the need for backward compatibility and redesigning the Intel architecture for the 21st century. While the document is only a proposal with no promises attached, it shows that Intel is starting to recognise the challenges of backward compatibility and how many architectural features of the past still persist in modern designs.
For example, Intel acknowledges that the boot process on modern devices is archaic: the CPU still starts in 16-bit mode, something that modern operating systems leave almost immediately and never return to.
The proposed x86S architecture would require 64-bit equivalents of technologies that currently run in either real mode or protected mode. This includes booting CPUs (SIPI), which currently starts in real-address mode and needs a 64-bit replacement. The proposed architecture would also allow the use of 5-level paging without leaving paged mode. Intel believes that these modifications can be implemented with straightforward enhancements to the system architecture, affecting only the operating system.
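For a feel of what 5-level paging adds, the sketch below splits a 57-bit virtual address into the five 9-bit page-table indexes and the page offset defined by the LA57 layout for 4 KB pages; the address used is purely an example, and this is an illustration of the documented bit layout rather than anything taken from the whitepaper.

```c
/* la57_split.c - split a 57-bit virtual address into 5-level paging indexes.
 * Illustrative only, based on the documented LA57 bit layout (4 KB pages). */
#include <stdio.h>
#include <stdint.h>

#define IDX(va, shift) ((unsigned)(((va) >> (shift)) & 0x1FF))  /* 9-bit index */

int main(void)
{
    /* Example address, masked to the 57 bits LA57 actually translates. */
    uint64_t va = 0x01ABCDEF12345678ULL & ((1ULL << 57) - 1);

    printf("PML5  index: %u\n", IDX(va, 48));  /* bits 56:48 */
    printf("PML4  index: %u\n", IDX(va, 39));  /* bits 47:39 */
    printf("PDPT  index: %u\n", IDX(va, 30));  /* bits 38:30 */
    printf("PD    index: %u\n", IDX(va, 21));  /* bits 29:21 */
    printf("PT    index: %u\n", IDX(va, 12));  /* bits 20:12 */
    printf("page offset: %u\n", (unsigned)(va & 0xFFF));  /* bits 11:0 */
    return 0;
}
```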
Furthermore, Intel is exploring the benefits of extending the ISA transition to a 64-bit mode-only solution, which could reduce the overall complexity of both the software and hardware architecture. Intel also points out the burden of keeping code and segment registers backwards compatible and proposes that a new architecture would use a simplified segmentation model. Additionally, rings 1 and 2 could be removed due to their lack of use in modern designs, along with numerous other features, including 16-bit addressing support, legacy I/O port access features, and some operating system mode bits.
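As a rough illustration of why rings 1 and 2 are considered dead weight, the sketch below shows how the privilege ring is encoded in the low two bits of a segment selector; mainstream operating systems only ever use rings 0 and 3, and the selector values here are examples rather than values taken from any particular OS.

```c
/* cpl.c - the privilege ring is carried in the low two bits of a segment
 * selector. Mainstream OSes use only ring 0 (kernel) and ring 3 (user),
 * which is why x86S proposes dropping rings 1 and 2 entirely. */
#include <stdio.h>
#include <stdint.h>

static unsigned ring_of_selector(uint16_t selector)
{
    return selector & 0x3;  /* bits 1:0 hold the privilege level */
}

int main(void)
{
    /* Example selector values only: */
    uint16_t kernel_cs = 0x08;  /* typically a ring 0 code segment */
    uint16_t user_cs   = 0x33;  /* typically a ring 3 code segment */

    printf("kernel CS 0x%02X -> ring %u\n", kernel_cs, ring_of_selector(kernel_cs));
    printf("user   CS 0x%02X -> ring %u\n", user_cs, ring_of_selector(user_cs));
    return 0;
}
```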
Fundamentally, the whitepaper proposes a true 64-bit processor that drops all hardware relating to older systems while retaining the ability to run 32-bit user-mode applications, as these are still highly relevant today.
For those who wish to delve deeper into the intricacies of Intel's proposed x86S architecture, I highly recommend downloading and reading the full whitepaper. It provides a comprehensive overview of the architectural enhancements and modifications that Intel is currently investigating for a 64-bit mode-only solution. You can download the whitepaper directly from Intel's website. It's a fascinating read for anyone interested in the future of processor design and the potential benefits of eliminating backward compatibility.
Why is this a step in the right direction?
The year is 2023, and it is utterly shocking that next-generation processors can still execute 16-bit code found only on extremely old operating systems such as MS-DOS. Not only is it possible to get such operating systems running, albeit with some adjustments to hardware and software, but old storage devices such as floppy drives can be connected to modern computers, which will happily boot from the first 512 bytes of code built with ancient toolchains.
By dropping backward compatibility, Intel can re-engineer its devices to fully utilise the silicon that they use, providing engineers not only with better architectures for modern applications but even reducing energy consumption and increasing efficiency. Of course, there will undoubtedly be those who operate in obscure and niche applications that will require backward compatibility, but for the vast majority of software and hardware engineers, a change in architecture cannot come too soon.
Unfortunately, this whitepaper is just that: a proposal, and there are no promises that such an architecture will ever be built. However, if the industry response is overwhelmingly positive, then it could be the start of a new era of Intel processors.
Looking ahead, the shift towards a 64-bit mode-only architecture could have profound implications for the future of processor design. As software continues to evolve and demand more from hardware, the need for processors that can fully leverage the capabilities of modern software will only grow. We may see a trend towards more specialised processors designed with specific applications or industries in mind. For instance, processors optimised for AI and machine learning applications or for handling the vast amounts of data generated by IoT devices. On the other hand, the elimination of backward compatibility could also pose challenges, particularly for industries or applications that still rely on legacy systems. Navigating this transition will require careful planning and consideration, but the potential benefits in terms of improved performance, efficiency, and innovation make it a promising prospect for the future of the industry.