Why It’s So Hard To Create New Processors

Many companies are interested in developing their own processors, following the success of RISC-V, but verification is a daunting challenge.


The introduction, and initial success, of the RISC-V processor ISA has reignited interest in the design of custom processors, but the industry is now grappling with how to verify them. The expertise and tools that were once in the market have been consolidated into the hands of the few companies that have been shipping processor chips or IP cores over the past 20 years.

Verification of a processor is different from the verification of other pieces of IP, or even an SoC. A processor is the ultimate piece of general-purpose hardware, and that creates its own unique set of issues.

“It can run any software program,” says Paul Cunningham, corporate vice president and general manager at Cadence Design Systems. “It is one of the most configurable deep-state devices that you can imagine. To truly say I’ve completed verification of the CPU is to say that you have run every possible software program that could run on that CPU, which of course you’ll never do. It is completely intractable. CPU verification is extremely difficult.”

In an age when hardware has to be as flexible as possible, it is prudent to do as much as you can in software. "One thing a company found is that it's far better to build state machines in a processor using a little bit of software than to create state machines in Verilog," says Simon Davidmann, CEO of Imperas Software. "If you build your state machines in Verilog and get something wrong, you have to re-spin the chip. But if you build small controllers and program the state machines in software, you can fix things later if there are issues."
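
To make the point concrete, here is a minimal sketch (in C, with hypothetical state and event names) of the kind of table-driven state machine that can live in firmware on a small controller. A wrong transition here is a software patch rather than a silicon re-spin.

```c
/* Minimal sketch (hypothetical states/events): a control sequence implemented
 * as a table-driven state machine in firmware rather than hard-wired in RTL.
 * A bug here is fixed with a software update; the same bug in Verilog means a re-spin. */
#include <stdio.h>

typedef enum { IDLE, CONFIG, RUN, ERROR, NUM_STATES } state_t;
typedef enum { EV_START, EV_DONE, EV_FAULT, NUM_EVENTS } event_t;

/* next_state[current][event] -- edit this table in a field update if a
 * transition turns out to be wrong. */
static const state_t next_state[NUM_STATES][NUM_EVENTS] = {
    /* IDLE   */ { CONFIG, IDLE, ERROR },
    /* CONFIG */ { CONFIG, RUN,  ERROR },
    /* RUN    */ { RUN,    IDLE, ERROR },
    /* ERROR  */ { ERROR,  IDLE, ERROR },
};

int main(void) {
    state_t s = IDLE;
    const event_t trace[] = { EV_START, EV_DONE, EV_START, EV_FAULT, EV_DONE };
    for (unsigned i = 0; i < sizeof trace / sizeof trace[0]; i++) {
        s = next_state[s][trace[i]];
        printf("event %d -> state %d\n", trace[i], s);
    }
    return 0;
}
```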

But that creates two problems. "Something we discovered in a recent processor verification project is that you have to involve the software team in this process," says Hagai Arbel, CEO of VTool. "This brings a whole new set of challenges, because they are speaking a completely different language, both technically and mentally."

The second problem is verification. “A processor is only as good as its verification,” says Duaine Pryor, chief technologist for Mentor, a Siemens Business. “Brute-force solutions to verification closure aren’t feasible. We are seeing this play out in both the high- and low-end segments of the market.”

Regardless of whether a processor core is based on RISC-V or a unique instruction set architecture (ISA), thorough verification is critical. "Users can no longer count on the decades of validation in silicon they enjoyed with legacy processors," says Nicolae Tusinschi, design verification expert at OneSpin Solutions. "RISC-V adopters get no special pass; experience has shown that many open-source cores are weakly verified and contain many corner-case bugs. All processor developers must extensively verify their core designs, document the processes and the coverage achieved, and enable users to leverage this verification for the full system-on-chip (SoC) designs containing the cores."

Some aspects of processors are common, regardless of the scale of the project. “Processor verification can be broadly divided into control verification and datapath verification,” says Daniel Schostak, architect and fellow for the central engineering group at Arm. “Neither is simple, and both have their challenges. With control verification, the main challenges are the number of interacting components and ensuring all the corner cases are covered, while with datapath verification, the main challenge is working out the boundary cases so that these can be properly covered. These challenges can be further complicated by micro-architectural features for area, performance or power.”
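
As a rough illustration of the datapath side, the sketch below (purely hypothetical, not any vendor's methodology) cross-combines boundary operands for a 32-bit adder and compares carry-out and signed overflow against a wider reference model. Real datapath verification goes far beyond this, but the boundary cases are where the bugs tend to hide.

```c
/* Sketch: enumerating boundary operands for a 32-bit adder datapath and
 * checking carry-out and signed overflow against a wider reference model. */
#include <stdint.h>
#include <stdio.h>

static const uint32_t corners[] = {
    0x00000000u, 0x00000001u, 0x7FFFFFFFu, 0x80000000u, 0xFFFFFFFEu, 0xFFFFFFFFu
};

int main(void) {
    size_t n = sizeof corners / sizeof corners[0];
    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {
            uint64_t wide = (uint64_t)corners[i] + corners[j]; /* wide reference */
            uint32_t sum  = (uint32_t)wide;                    /* DUT-width result */
            int carry     = (int)((wide >> 32) & 1);
            /* signed overflow: operands share a sign that the result does not */
            int ovf = (int)((~(corners[i] ^ corners[j]) & (corners[i] ^ sum)) >> 31);
            printf("%08x + %08x = %08x c=%d v=%d\n",
                   (unsigned)corners[i], (unsigned)corners[j], (unsigned)sum, carry, ovf);
        }
    }
    return 0;
}
```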

The problem is compounded further in the open-source arena, particularly when it comes to RISC-V. “On one side you have functional verification,” said Louie De Luna, director of marketing at Aldec. “But then you also have compliance testing. You need to ensure that any custom instructions are compliant with the source code. That affects your coverage model, and there is no de facto tool flow yet. There is still a lot of work to be done in this area. In addition to learning how to use all of the tools, you also have to see how they can be used for testing.”

Moving complexity
While processor performance has not increased in a significant manner since around 2000, that does not mean complexity has remained constant. "We're still seeing the complexity of the processor increasing," says Colin McKellar, vice president of verification platforms for Imagination Technologies. "There is the increasing desire to squeeze more and more out of the process. And there's a very strong desire to have some form of unique feature set or unique way of doing stuff."

That desire for uniqueness is driving the current trend. “At the lower end, especially for edge devices, power and cost constraints are much tighter; price points are lower,” says Mentor’s Pryor. “This means that more tightly coupled acceleration, customizability, and even removing instructions are valuable. Smaller designs and captive software contribute to lower re-verification cost compared to the high end. RISC-V provides an easily customizable platform at lower cost. Taken together, the advantages of customizing are greater than the disadvantages, creating a new group of verification engineers faced with the processor (re)verification problem, albeit on smaller and more closed systems.”

At the high-end, other forces come into play. “Processor architecture began with more complex instructions migrating towards RISC,” says Cadence’s Cunningham. “That started flattening out and we started going into multi-core. And then we’ve run out of scaling on multi-core. At this point, all of the new levels of complexity are around customized specific architectures — different kinds of compute engines that are tailored to specific verticals, such as AI or graphics or video. The underlying complexity of the core pipeline may not be changing as fast as it was, but it’s still out there. There are more things around security. And there’s so much in terms of speculative execution.”

Speculative execution is an optimization technique that has been linked to a number of high-profile processor vulnerabilities, such as Spectre and Meltdown. Security is an ongoing challenge, and it is an area in which design requirements are still being added.

“Security is a new dimension for modern processors, and ensuring that the processor enables an effective hardware root of trust (HRoT) is key,” says Dave Kelf, vice president and chief marketing officer for Breker Verification Systems. “To verify this, negative verification (proving that there is no way to access areas except the prescribed mechanism) is important. Formal is good for this, but runs out of capacity at the system level. As such, tools that allow the system state space to be specified and analyzed will be an important toolbox component for the processor verification team.”
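
One way to picture that negative-verification idea: exhaustively walk the state space of a toy lock model and confirm that the protected value never becomes readable except via the prescribed unlock sequence. The model below is purely illustrative (hypothetical operations and states), but the shape of the check, a property held over every reachable state, is what formal and state-space analysis tools automate at far larger scale.

```c
/* Sketch of "negative verification" on a toy lock model: the secret may only
 * become readable after the prescribed KEY1-then-KEY2 sequence. */
#include <stdio.h>

enum op { WRITE_KEY1, WRITE_KEY2, WRITE_JUNK };

/* Toy model: step 0 -> 1 on WRITE_KEY1, 1 -> 2 on WRITE_KEY2; anything else resets. */
static int next(int step, enum op o) {
    if (step == 0 && o == WRITE_KEY1) return 1;
    if (step == 1 && o == WRITE_KEY2) return 2;   /* unlocked: secret readable */
    return 0;
}

int main(void) {
    int violations = 0;
    /* Enumerate every operation sequence of length 4 (3^4 = 81 cases). */
    for (int seq = 0; seq < 81; seq++) {
        int step = 0, saw_key1_then_key2 = 0, prev_was_key1 = 0;
        for (int i = 0, s = seq; i < 4; i++, s /= 3) {
            enum op o = (enum op)(s % 3);
            if (prev_was_key1 && o == WRITE_KEY2) saw_key1_then_key2 = 1;
            prev_was_key1 = (o == WRITE_KEY1);
            step = next(step, o);
        }
        /* Negative property: readable implies the prescribed sequence occurred. */
        if (step == 2 && !saw_key1_then_key2) violations++;
    }
    printf("violations: %d\n", violations);
    return 0;
}
```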

When there is an expectation for an operating system to be running on the processor, additional levels of support have to be made available. “General-purpose processors are expected to support all the modern operating systems and a host of applications on top,” says Shubhodeep Roy Choudhury, co-founder of Valtrix Systems. “Most of these use cases require support from the hardware, such as TLB/MMU for virtual memory, hypervisors for virtualization, and FPU for floating point computation. Support for parallel threads of execution has memory ordering and cache coherency implications. Given the number of use cases, processor units and cross-products, the verification of general-purpose processors is a much more challenging task compared to the other designs.”
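
Memory ordering is one of those cross-products. A classic "store buffering" litmus test, sketched below with POSIX threads and relaxed C11 atomics, shows the flavor of the tests run in enormous volume against multicore designs. This version simply runs on a host machine to illustrate the shape of the check; the orderings and iteration count are assumptions for the sake of the example.

```c
/* Sketch: a store-buffering litmus test. On a strongly ordered machine the
 * r0 == 0 && r1 == 0 outcome is forbidden; weaker memory models may allow it. */
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

static atomic_int x, y;
static int r0, r1;

static void *writer_x(void *arg) {
    (void)arg;
    atomic_store_explicit(&x, 1, memory_order_relaxed);
    r0 = atomic_load_explicit(&y, memory_order_relaxed);
    return NULL;
}

static void *writer_y(void *arg) {
    (void)arg;
    atomic_store_explicit(&y, 1, memory_order_relaxed);
    r1 = atomic_load_explicit(&x, memory_order_relaxed);
    return NULL;
}

int main(void) {
    int both_zero = 0;
    for (int i = 0; i < 100000; i++) {
        atomic_store(&x, 0);
        atomic_store(&y, 0);
        pthread_t a, b;
        pthread_create(&a, NULL, writer_x, NULL);
        pthread_create(&b, NULL, writer_y, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        if (r0 == 0 && r1 == 0)
            both_zero++;   /* the reordered outcome */
    }
    printf("r0==0 && r1==0 seen %d / 100000 times\n", both_zero);
    return 0;
}
```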

Paying for verification
There is no escaping the price that has to be paid to perform verification of a processor core. You can either pay the cost directly yourself, or pay someone else to do it for you.

“Top processor companies have made a huge investment and have a lot of expertise so that they can deliver something that is good,” says Cunningham. “It has to be remembered that RISC-V is an instruction set. It’s not an actual implementation of the CPU. The implementation has to be verified and must be of sufficient quality, as well as meeting certain power, performance, and area targets. Arm is not just an instruction set. They’re offering the whole ecosystem. They’re spending millions of dollars on that.”

Quality is important for processors. "Arm runs something like 10^15 instructions per core," says Imperas' Davidmann. "It's a lot of instructions. The average RISC-V developer or IP provider doesn't even have hardware emulators. They're not able to do anywhere near that amount of verification. Over the years, when people bought Arm, MIPS, or Power processors, they relied on the IP vendor for the verification."
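
Some back-of-the-envelope arithmetic shows why that scale of verification is out of reach without emulation farms and large regression clusters. The throughput figures below are rough assumptions for illustration, not vendor numbers.

```c
/* Rough arithmetic: wall-clock time to execute 1e15 instructions at assumed
 * per-engine throughputs (illustrative values only). */
#include <stdio.h>

int main(void) {
    const double total = 1e15;                 /* instructions */
    const char  *engine[] = { "RTL simulation", "emulation", "instruction-set simulation" };
    const double ips[]    = { 1e3,              1e6,         1e8 };  /* instr/sec, assumed */
    for (int i = 0; i < 3; i++) {
        double years = total / ips[i] / (3600.0 * 24 * 365);
        printf("%-28s ~%.1e years on a single engine\n", engine[i], years);
    }
    return 0;
}
```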

But not all cores are that large or complex. "A lot of the RISC-V projects today are quite small and the complexity is manageable," says Imagination's McKellar. "The cost associated with verifying them wouldn't be huge. But when you start improving the performance or adding complexity by doing multi-threaded, parallelized stuff, it can grow quite quickly. I would imagine half of the companies that try to do complicated processors will find it to be too big a challenge, and people up the food chain will stop the funding associated with it because it was expensive and it didn't necessarily work correctly."

Codasip vice president Jerry Ardizzone agrees. “One of our customers was going to use an open-source RISC-V core with open-source tools, creating 108 repeatable co-processors, and they ran into several issues. First, they had to prove that the core works. Then, they realized that the only test suite available for that core was open-source, because all other commercial tools were in toolkits developed by companies like Arm, Codasip, MIPS, and Synopsys. And then they had to show that all of these processors could handle 35 instructions for very specific acceleration, which in open-source is very hard. You can buy all of that from Arm and be sure that it works. You also can build your own core, which assumes that you know how to build a processor and can make sure that it doesn’t lock up. But verification is always the big bottleneck.”

That also can prove to be quite expensive. “Verification is the hardest part and the most expensive,” Ardizzone says. “It’s the bottleneck, and it takes at least a couple quarters of work. And every time you touch the hardware, you have to re-verify it.”

For open-source hardware, lower cost is one of the key selling points. “Companies like Intel and Arm are really good at this, and you know that when you integrate it into an SoC you’ll be able to verify it works,” said Aldec’s De Luna. “Now the industry is saying this isn’t so easy. If you think about the open-source community, a lot of what happens is based on a budget. But there are a lot of pieces that need to go together. The industry needs to put together an end-to-end flow, and that only can happen with more cooperation.”

Verification expertise
Twenty years ago, there were several companies that produced tools for processor verification. At that time, most systems companies had their own proprietary processor cores. Since then, most have migrated to cores from one of the large processor companies, and the expertise has become concentrated within those companies.

“Years of experience and methodology development are a huge advantage in being able to reliably verify and validate processors,” says Pryor. “We’re constantly impressed by the way processor houses drive innovation and efficiency in the verification flows.”

Consulting companies that have recently helped others to verify processor cores talk about how important that experience is. "Our prior experience and knowledge were essential," says VTool's Arbel. "I first started processor verification 20 years ago, but if this had been my first time tackling this type of problem, I would have stumbled trying to solve it. Even if you have a very small or simple processor, the design verification teams need to be ready with this kind of knowledge. There are several possible solutions, and we may not have done it the best possible way. But you definitely need to approach the problem differently compared to other types of design."

McKellar agrees. “You need both the expertise plus the methodology. They very much go hand in hand. Without the experts, the tools may not have as much value. Without the tools, the experts will struggle because the tool kit will be too weak. There’s a shortage of high-class verification engineers in the world and a lot of companies are competing with each other for the required skill set. Many of them do not have enough depth for what they need to do.”

And getting that knowledge is difficult. “How to test a processor has basically become closed and encapsulated inside those basic three or four big vendors,” says Davidmann. “There just aren’t lots of papers or publications or tools out there that can help.”

Helpful advice
Cunningham offers this advice. “First and foremost, go hire somebody. Make sure you know if you really want to go and build your own processor, and if you need to, take it seriously. It’s still a big deal. You need someone to lead and to own that.”

It all starts with planning. “Be sensible about what you can actually achieve and the timelines that you need for your time to market,” says McKellar. “Be focused on trying to limit combinations and reduce features. Some high-level features or combinations of features may not be that valuable to the end customer, but will cost you a lot of time and effort to verify them correctly. Be quite open-minded because there is no one answer. There’s not one thing that fits all. Reviews are hugely important. Having independent reviews and accepting critique is very important. And be very mindful of testing new stuff early. You should be doing the new and hard stuff as early as possible, spending a lot of time and effort on that and less time and effort on the older stuff.”

And also ask yourself why you are doing it. “Unless you’re trying to really do something innovative and custom, good solutions already exist,” says Davidmann. “If you’re trying to add some very interesting fabric things or custom instructions, then maybe that’s why you should go down the RISC-V route. More people are building their own or configuring their own processors, which means much more complexity, much more verification, and much more opportunity for the verification industry.”

Open-source verification
RISC-V certainly has advanced the notion of open-source hardware, and some are questioning if they also can expect open-source verification to emerge from this. “The success of an open-source model requires an infrastructure that enables real designs and products to be created, put into manufacturing and delivered to market,” says Bipul Talukdar, director of applications engineering for SmartDV. “A key piece of the necessary infrastructure is a RISC-V verification platform that accurately verifies designs using the open-source specifications with the CPU executing the ISA.”

Some see hope. “The RISC-V community is working together to help find solutions,” says Kevin McDermott, vice president of marketing for Imperas Software. “Perhaps this is best illustrated by the work at Google Cloud to develop and enhance an open source project for a RISC-V Random Instruction Stream Generator that makes use of the free riscvOVPsim reference model.”
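
For flavor, here is a minimal C sketch of the constrained-random idea behind such generators: emit random but legal RV32I R-type ALU encodings under a simple constraint (never write x0). A real generator, such as the open-source project mentioned above, also covers branches, loads and stores, privilege levels, and exception scenarios; this only shows the basic mechanism.

```c
/* Sketch: constrained-random RV32I R-type instruction generation.
 * Encodings follow the RISC-V spec: funct7|rs2|rs1|funct3|rd|opcode(0110011). */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

static const struct { uint32_t f7, f3; const char *name; } ops[] = {
    { 0x00, 0x0, "add" }, { 0x20, 0x0, "sub" }, { 0x00, 0x7, "and" },
    { 0x00, 0x6, "or"  }, { 0x00, 0x4, "xor" }, { 0x00, 0x1, "sll" },
};

static uint32_t rtype(uint32_t f7, uint32_t rs2, uint32_t rs1,
                      uint32_t f3, uint32_t rd) {
    return (f7 << 25) | (rs2 << 20) | (rs1 << 15) | (f3 << 12) | (rd << 7) | 0x33;
}

int main(void) {
    srand(1);  /* fixed seed so a failing stream can be reproduced */
    for (int i = 0; i < 10; i++) {
        int k = rand() % 6;
        uint32_t rd  = 1 + (uint32_t)(rand() % 31);   /* constraint: never write x0 */
        uint32_t rs1 = (uint32_t)(rand() % 32);
        uint32_t rs2 = (uint32_t)(rand() % 32);
        printf("%08x  %s x%u, x%u, x%u\n",
               (unsigned)rtype(ops[k].f7, rs2, rs1, ops[k].f3, rd),
               ops[k].name, (unsigned)rd, (unsigned)rs1, (unsigned)rs2);
    }
    return 0;
}
```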

Still, it is highly unlikely that core EDA tools will be replaced by open source. The real cost is not the EDA tools. It’s the complexity, and large processor vendors are still finding unexpected bugs and vulnerabilities despite years of experience with these issues.

The second part of this processor verification series will concentrate on emerging methodologies and tools being used by early adopters.

Related Stories
Verification Knowledge Center
Repository of top stories, special reports, white papers, blogs and videos
Making Sure RISC-V Designs Work As Expected
Open-source growth predictions are impressive, but the verification process can be harder than with commercial ISAs.
Will Open-Source Processors Cause A Verification Shift?
Tools and methodologies exist, but who will actually do the verification is unclear.
RISC-V Markets, Security And Growth Prospects
Experts at the Table, Part 1: The advantages and limitations of a new instruction set architecture.
RISC-V Challenges And Opportunities
Who makes money with an open-source ISA, the current state of the RISC-V ecosystem, and what differentiates one vendor from the next.
Will Open-Source EDA Work?
DARPA program pushes for cheaper and simpler tools, but it may not be so easy.
Open ISAs Gaining Traction
Emphasis on flexibility, time to market and heterogeneity requires more processing options.


