The fear of change and its actual cost

A few days ago I spoke with a potential customer (a technical leader at a big semiconductor company) who was afraid of changing the design flow of her hardware teams. She was neither the first, nor will she be the last, to be reluctant to change anything in an established design flow because of the supposed risks we often hear about. As a result, her company only uses standardized languages, standardized tools (I still do not understand what a standardized tool is!), a standardized flow, and so on. On the one hand, because I am a hardware designer, I could appreciate the safety of a "fully" standardized design flow. On the other hand, because I am also the sales manager at Synflow, I wondered how much it costs to keep using the same flow when alternative approaches could be more productive.

Safety first

First of all, it may sound strange, but I agree with this prospect about the safety of the usual RTL flow. As I have said on many occasions, I love RTL design and verification, and in my opinion the tools provided by Synopsys, Cadence, Mentor Graphics, Atrenta, and others are perfectly effective in an RTL design flow. Without a doubt, the good old design flow that starts from a specification and ends with a functional product on an ASIC (or an FPGA) - creating a high-level model/reference software, creating/optimizing/verifying the RTL design, updating the model/reference software, etc. - offers the safety of years of professional habits and allows managers to sleep at night.

I have used this flow myself for years because I was certain it was the most efficient flow for hardware design. Indeed, I tried alternatives like the C-to-gates compilers that EDA vendors started shipping ten years ago... and honestly I thought "are you kidding me?" The input language was pseudo-C that looked nothing like the C you write for software, cluttered with all sorts of weird mnemonics, attributes, and other macros. Other vendors supported SystemC and presented it as the final answer to RTL complexity, but it seemed more complex to me than Verilog or VHDL and was not really suited to designing hardware (more details about SystemC for hardware design). Worse, the QoR was lower than I expected, the code generation unpredictable, and the generated code unreadable (more here). I concluded that nothing could beat the RTL design flow.

Over the years, I have been reading more and more articles about the complexity of the RTL design flow, and since we founded Synflow, I have spoken with numerous RTL designers who agree that RTL design is tedious, complicated, and inefficient for dealing with the complexity of today's SoC and IP core designs. But if RTL is so bad, why do the majority of designers stay at that level? Everyone should have started to design differently, or to use new languages with a higher level of abstraction to reduce the amount of detailed design work required. Historically, hardware design went from hundreds of pages of gate-level schematics to thousands of lines of RTL, so what has happened since then? The answer is simple: because the costs of hardware design have skyrocketed, and High Level Synthesis has failed to meet expectations, the fear of change has triumphed over the desire to keep moving forward.

The actual cost of maximal safety

All right, so maximal safety now seems to be the default choice in the semiconductor industry, with standardized design flows and software supplied only by well-known EDA companies... My question here is: what is the actual cost of this? This is not an easy question, since there are a lot of hidden costs that one does not consider (or maybe does not want to consider, since it is easier not to change anything). For instance, what is the overall cost of complex backup and recovery of RTL (and sometimes gate-level) code? Of inefficient project storage without proper version control? Of employees who use a bad coding style? And more generally, what is the cost of the loss in competitiveness due to the use of (standardized) languages ill-suited to hardware design, or of ageing languages? Even though it starts from the best of intentions (to secure the process of designing IP cores and SoCs), safety margins create unnecessary expenses.

A few months ago we spoke with prospects from one of the biggest companies specialized in the design of wireless communication devices (e.g. WiFi, 3G/4G modems) on ASIC technology. Creating a 3G modem from scratch required around 50 engineer-months of experienced engineers, with a per-engineer labor cost of $100k/year. The design team used the design flow I introduced earlier (creating a high-level model, creating the RTL design, ...) because it was the only way to get the QoR required by their customers. The cost of this design was thus around (50/12) × $100k ≈ $400k. Now let's imagine that they introduced into their design flow a new tool that makes the development of SoCs and IP cores easier, reduces the risk of bugs, and reduces verification time, resulting in a productivity increase of approximately 5 to 10×, depending on the complexity of the SoC. If you do the math, this would reduce the cost of that same design from $400k to between $40k and $80k - simple as that. Cost of the safety? More than $300k.
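The arithmetic above can be sketched in a few lines; this is a back-of-the-envelope estimate only, and all the figures (50 engineer-months, $100k/year, a 5 to 10× productivity gain) are the rough assumptions quoted in the text, not measured data:

```python
# Back-of-the-envelope design cost estimate.
# All inputs are rough assumptions taken from the text above.

def design_cost(engineer_months, annual_salary):
    """Labor cost of a design, given effort in engineer-months
    and a per-engineer annual salary."""
    return engineer_months * annual_salary / 12

baseline = design_cost(50, 100_000)   # about $417k, roughly $400k
best_case = baseline / 10             # assumed 10x productivity gain
worst_case = baseline / 5             # assumed 5x productivity gain

print(f"baseline cost:    ${baseline:,.0f}")
print(f"with new flow:    ${best_case:,.0f} to ${worst_case:,.0f}")
print(f"potential saving: up to ${baseline - best_case:,.0f}")
```

Even under the pessimistic 5× assumption, the saving exceeds $300k per design, which is the figure the paragraph above arrives at.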

Pretty impressive, wouldn't you agree? And what happens if something goes wrong (the infamous risk)? Let me debunk a generally accepted idea: you do not need to be a big group to provide support. In fact, start-ups and small companies are more likely to solve any problem their customers might have, and to improve their products far more quickly. Such an approach would be worth trying, given the constant pressure on SoC and IP core designers to come up with something better and even cheaper than what the state of the art was only a few months before. How long can designers manage the increasing complexity with the same old design flow? Designers and engineers are already working at their limits, and in the end, who will have to pay for maximal safety: customers, or the companies that refuse to move forward?
The hidden cost of too much safety


The fear of change and its actual cost is a fascinating topic. I have discussed it in the context of the semiconductor industry because it is my area of expertise, but the same holds in other conservative industries. By seeking safety, companies ensure - in the short and medium term - that they will continue to fulfill their contracts and serve their clients in an effective and timely manner. By seeking too much safety, companies penalize themselves: they act in the same way as their competitors, use the same methodology, the same software, and the same design flow. As a result, they lose their competitiveness, become rigid and obsolete, fall victim to the Innovator's Dilemma, and are replaced by leaner and more agile companies. My main concern is the consequence of the fear of change: it stifles innovation and discourages entrepreneurship, which could result in a downturn in the EDA and semiconductor economies.

From here on, I see two possibilities:

  • semiconductor companies stay risk-averse and keep buying from the same vendors, leading to a future where EDA is completely consolidated, with the Big Three (or Two) having acquired everybody else, or

  • semiconductor companies become more agile and more open-minded towards start-ups, and contribute to making the EDA ecosystem much more diverse and dynamic, and allow innovation to thrive.