A few weeks ago on Hacker News a link was submitted to a post entitled How FPGAs work, and why you'll buy one (109 points, link to discussion). The post was actually written in 2013 and had already been submitted once before (379 points, see here for discussion); if you don't know what an FPGA is, I suggest you read that post, as it is quite good and instructive. Now, as is often the case on HN, I find that the discussions are even more interesting than the article itself. This case is no exception, and I found one comment in particular by tjradcliffe that seemed to sum up the problem with FPGAs pretty well: "FPGAs: they're the technology of the future, and always will be!"
I couldn't agree more. I mean, think about what you can do with an FPGA: you get to create your own hardware, prototype your own chip. This is as close to the metal as it gets! And you can have lots of small "cores" running in parallel, for a fraction of the power that would be required to do the same with a CPU or a GPU. Seriously, how awesome is that?
With all this, how are we not all using FPGAs?
What's the catch?
Well, as you can see in the discussions on HN, or in this rant on Reddit, there are several reasons. One is the price (although this is less and less true, thanks to the community making cheaper boards); another is that the tools suck (and are proprietary). But I think that the number one reason is this:
- ease of use
You need to use one of the two so-called Hardware Description Languages (VHDL and Verilog), in which you describe everything that happens at each clock cycle. And the fun part is, those languages were made for writing simulations, so before you can even get something running on that FPGA, you need to be aware of a ton of guidelines about what will work, what might work, and what won't; for instance, this is a 24-page guide on when to use blocking and nonblocking assignments in Verilog. This is just one example, but you can find hundreds of different coding style guides for Verilog, and even more for VHDL: since the latter is more complex, there are more ways to shoot yourself in the foot.
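To see why such a guide needs 24 pages, here is the classic pitfall in miniature (a minimal sketch; the two always blocks are alternatives, not meant to coexist in one module). The intent is a two-stage pipeline, d → q1 → q2:

```verilog
// With nonblocking assignments (<=), both registers are inferred:
// q1 and q2 update simultaneously at the clock edge using the *old*
// values, so q2 lags d by two cycles, as intended.
always @(posedge clk) begin
    q1 <= d;
    q2 <= q1;
end

// With blocking assignments (=), statements execute in order:
// q1 takes the new value of d *before* q2 reads it, so q2 gets d
// after one cycle and synthesis infers a single register.
always @(posedge clk) begin
    q1 = d;   // q1 is updated immediately...
    q2 = q1;  // ...so q2 sees the new d, not the old q1
end
```

Two snippets that differ by one character describe different hardware, and with blocking assignments the simulated behavior can even depend on statement ordering across blocks. That's the kind of trap the guidelines exist to help you avoid.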
Many of these guidelines aim to reduce mismatches between simulation and synthesis, the process that transforms your code into something that can be uploaded to the FPGA. As you can imagine, a simulation/synthesis mismatch is something quite nasty to have (the software analogy would be code that behaves differently in debug builds than in release builds). Now the even funnier part is that each FPGA vendor has its own synthesis tool, with its own unique subset of things it accepts. Yay! This potent combination makes it incredibly difficult to do simple things in a portable way, such as declaring a RAM block.
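To make the RAM example concrete, here is a sketch of the kind of "inference template" vendors document (module and parameter names are my own; assumptions: a single-port, synchronous-read RAM, written in the style that both major vendors' tools recognize as block RAM):

```verilog
// Single-port RAM in inference-template style. Written exactly like
// this, the synthesis tool maps it onto a dedicated block RAM.
// Deviate slightly -- e.g. an asynchronous read, or a reset on the
// memory array -- and the tool may silently fall back to building
// it out of registers and LUTs, or reject it outright.
module ram_sp #(parameter WIDTH = 8, DEPTH = 256)
              (input  wire                     clk,
               input  wire                     we,
               input  wire [$clog2(DEPTH)-1:0] addr,
               input  wire [WIDTH-1:0]         din,
               output reg  [WIDTH-1:0]         dout);

    reg [WIDTH-1:0] mem [0:DEPTH-1];

    always @(posedge clk) begin
        if (we)
            mem[addr] <= din;
        dout <= mem[addr];  // synchronous read: needed for block RAM
    end
endmodule
```

The template itself is trivial; the problem is that "trivial" is defined per-vendor, per-tool-version, so portable code means reading several synthesis manuals and writing to their intersection.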
Note that this is not new, and as far as I know, things haven't improved much in the past decade or so.
At the same time, FPGAs seem to become less and less popular over time: I recently took a look at searches for FPGA on Google Trends, and couldn't help but notice the decrease in popularity. This is relative to the total search volume, so it can either mean that the absolute number of searches for FPGAs is decreasing, or that it is still growing albeit at a much slower rate than the total number of searches.
The way I see this, existing users may still be using FPGAs because they're the best tool for the job and they are already used to the platform, but potential users are instead choosing other platforms that are massively easier to use. Just compare the trends for FPGA (blue) and Arduino (red), for instance:
The difference is not as pronounced between FPGA and GPU, but the story is the same.
How to fix it?
We think that it doesn't have to be this way. We believe that using FPGAs should be an enjoyable experience. We feel that programming an FPGA should mean writing code in a better language, with an open source IDE.
What do you think? Could better languages and tools, and open source, help make FPGAs easier to use and more popular? Or will they remain a niche with a dwindling popularity?