Wednesday, March 21, 2018

FPGA selection for beginners

I'm no particular expert on FPGAs, but I'd like to share some of the thought process and research that went into my own start at tinkering with them, in the hope of helping other people contemplating the same.

Why FPGAs?

FPGAs are the go-to solution for "digital things that can't be done with Raspberry Pi, Arduino, or other microcontroller plus software." If you want more than a handful of discrete digital devices in your project, an FPGA (or its smaller cousin, a CPLD) can be whatever collection of gates you need. From a more industrial perspective, they are a stepping-stone to the kind of large-scale digital design that goes into developing ASICs and high-powered DSP solutions.

I'll probably discuss later some of the things that are not well suited to FPGAs; for now I'll assume you understand what they are and what they can do, and that you just really, really want to use one. So you want to buy an FPGA board for good reasons and can't decide which: how do you choose?

Choose a project before a board

I would first suggest you have at least some kind of concrete project idea in mind. If your idea goes beyond "blink an LED", you are probably going to need at least some basic connections to other things in the real world, and that constrains your selection.
  • Do you want images to appear on a VGA or HDMI display?
  • Do you want to capture still images or video from a camera?
  • Do you want your thing to connect to an Ethernet, WLAN, Bluetooth, or other network?
  • Is your project going to connect to a USB keyboard, mouse, or itself be a USB peripheral?
  • Do you want to capture or produce audio?
  • Do you want more than 10 digital I/O?
  • Will your project require megabytes of RAM?
  • Is there going to be a large software component to your project?
This doesn't mean you need a full project plan and design in hand. Maybe you can look at some of the many projects demonstrated on blogs or YouTube videos and say "I'd like to have one of those."

FPGA development has a higher barrier to entry in cost, difficulty, and complexity than a Raspberry Pi or Arduino. If you don't have a goal in mind, it's harder to maintain your motivation. Furthermore, if you pick a board without these constraints in mind, you might well have to spend another $100 or more on the right board before you can do anything useful.

Most of the constraints are straightforward: if you want to connect to Ethernet, HDMI, or USB, having the appropriate PHY components on board is practically essential, and hard to add after the fact. FPGAs are typically weak in A/D and D/A conversion. The chips offer massive numbers of digital I/O pins, but not all boards bring them out to connectors.

The "external memory" and "software component" points get at some of the broad divisions I would make in the FPGA landscape.

SoC or non-SoC

First, the difference between "SoC" and non-SoC FPGA architectures. SoC stands for "system on a chip", but in the FPGA world it invariably means "hard-core processor." A Xilinx Zynq device includes one or more ARM cores (and a bunch of peripherals) integrated with the FPGA fabric. Intel/Altera do the same, and use an "SoC" qualifier in the product name. With these devices it is trivial to run Linux or another ARM operating system without using up any of the logic gates. In contrast, if you don't have an ARM core but want traditional processing available, you will have to give up a chunk of FPGA resources to a "soft-core" processor like the Xilinx MicroBlaze or Altera/Intel Nios, which will be much slower than a hard ARM core and less compatible with other ARM development.

SoC means you get a powerful processor for free, but it also means additional complexity in the project development environment. A plain FPGA can literally be configured as a single NAND gate or a 24-bit counter in a dozen lines of code, and get LEDs to do things that depend on switches. An SoC device needs some minimum configuration just to put the hard cores into a reasonable state. Many of the external connections on an SoC board will be dedicated to the ARM core or its associated bus. The vendor IDEs that control these things are made for complex SoC projects, exporting the SoC configuration to independent software teams with GUI-based wizards and pages of options, all of which gets in the way of beginner HDL development.

At this point, if you are thinking "sure, I need all of that to develop my ARM-based operating system," you should double-check that you don't really want a cheaper, lower-overhead solution like a Raspberry Pi. In particular, if you are thinking "I want to design my own ARM CPU," you probably don't. ARM is complex, and protected by ARM lawyers. If you manage to design an ARM-like CPU without infringing on patents, it will be dog-slow compared to a hard-core version. Finally, if you really are going to design your own CPU, the software effort will be comparable to the digital design effort, and you will want your own software-based emulation of your architecture to do it. Which brings up the part of the process known as "FPGA Hell."

Your board won't work until it works without a board

Almost certainly, your first project will be downloaded to an FPGA board and literally nothing will happen. Well, something will happen, but you won't see it; you might not even be able to measure it with an oscilloscope or logic analyzer. How do you proceed? You double-check your simulation results and enhance your test bench to check for problems in your design. Your $100 FPGA board was no help. In fact, before you get a design working on a board, you will have to spend a bunch of time figuring out how to simulate and verify it without touching the hardware. So why not try that first, before you spend any money at all?
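Part of escaping FPGA Hell is having something trustworthy to compare the simulation against. A software "golden model" is one cheap way to get that. Here's a minimal sketch in Python (the function name and framing are my own, not from any vendor flow), modeling the kind of free-running counter a first design might implement:

```python
# Hypothetical "golden model" of a free-running 24-bit counter
# (the function name is my own invention, not from any vendor tool).
def counter_model(width=24):
    """Yield successive counter values, wrapping at 2**width."""
    value = 0
    while True:
        yield value
        value = (value + 1) & ((1 << width) - 1)

# Sample the model; an HDL testbench's sampled outputs can be
# diffed against this sequence, so a disagreement points at the
# design rather than the board.
model = counter_model()
print([next(model) for _ in range(5)])  # [0, 1, 2, 3, 4]
```

The payoff comes later: when the board does nothing, a mismatch between the model and the HDL simulation tells you the bug is in your logic, not your wiring.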

Working without a board

Download one of the free development environments. From Xilinx, the Vivado WebPack is what you want; the older ISE is necessary only if you are targeting the old Spartan-6 or other pre-"7" product lines, which you probably should avoid. For Altera/Intel devices, "Quartus Prime Lite" is freely available. Or try a free HDL environment like GHDL or Icarus Verilog. Figure out how to write a simple design like a counter and verify that it works by coding an HDL test bench or by inspecting waveforms in the simulator. Maybe the desire to use an FPGA will wear off, or at least you will have a few more days to think about which board you want.

Getting back to board selection, external memory is a big differentiator among FPGA boards. Adding high-capacity RAM to a board that doesn't already include it is impossible or not worth the trouble. If your project needs megabytes of RAM for high-level software, or to hold the images, video, or audio data your device is processing, you will want that RAM on the board. FPGAs with gigabytes of RAM in the package do exist, but for home use you can't afford them, or the license for the development tools that support these high-end chips.
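To make the RAM question concrete, a quick back-of-the-envelope calculation shows even a modest framebuffer outgrows on-chip block RAM (the 1800 Kb block-RAM figure below is my approximation for a small Artix-7 part like the XC7A35T; check the datasheet before relying on it):

```python
# Back-of-the-envelope framebuffer sizes versus on-chip block RAM.
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one frame at the given resolution and depth."""
    return width * height * bits_per_pixel // 8

vga_8bpp = framebuffer_bytes(640, 480, 8)    # 307200 bytes, ~300 KB
vga_24bpp = framebuffer_bytes(640, 480, 24)  # 921600 bytes, ~900 KB

# A small Artix-7 has on the order of 1800 Kb of block RAM, roughly
# 225 KB -- my approximation, not a vendor quote.
block_ram_bytes = 1800 * 1024 // 8

print(vga_8bpp > block_ram_bytes)  # True: even 8bpp VGA doesn't fit on-chip
```

So anything that displays real images needs external RAM on the board, which is exactly why boards without it feel so limited.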

High-performance is not hobby territory

Generally speaking, if you need high-speed digital connections like PCI Express or high-speed LVDS links, you will be looking at high-performance chips that are not supported by the free editions of the software. You likely won't be able to afford the chips or development boards, either. And, no, mining cryptocurrency will not pay for your hobby: cryptocurrencies are now designed specifically to defeat FPGA implementations, and real ASICs that cost many thousands of dollars are sucking away all available profit until the bottom falls out.

Other board criteria

Another differentiator among these boards is design quality and the availability of tutorial and educational material to get started. Digilent (for Xilinx) and Terasic (for Intel/Altera) seem to be the primary vendors targeting the academic market: they offer academic discounts if you have a .edu e-mail address, their boards come with plenty of documentation and sample projects, and they are used by lots of university classes that put their own material online.

Cheap online-marketplace knockoffs and used eBay boards are to be avoided. You don't want to have to reverse-engineer where an unknown, unreachable engineer took shortcuts or swapped a component because it was a few cents cheaper in the Shenzhen market. A used board is likely to carry an older FPGA that might not be supported by the current development tools.

Development tools tend to be very finicky about exactly which host OS they run on. I ran into real problems using a 2016 Vivado release on Ubuntu 16.04 because of a showstopper bug on Xeon processors, where the solution was "use Ubuntu 14.04 or wait for the next Vivado release." Be prepared to dedicate your development machine to match what the FPGA vendor wants. Synthesis and place-and-route can easily demand 8 GB of RAM or more. I built a dedicated Linux machine for this purpose; running in a VM on an Intel Mac was significantly slower.

A few random boards

Digilent Arty A7: this is what I bought. $99 is a pretty good price, and it comes with external RAM and Ethernet, though somewhat limited I/O. The Artix is a non-SoC Xilinx line, so the "run Linux" examples use the MicroBlaze soft core, and you spend a noticeable fraction of the FPGA on the memory interface. I don't understand the appeal of the S7 version, which gives up the Ethernet in exchange for more logic.

Digilent Basys 3: targeted at education, but has no external RAM. In my mind, that severely limits what you can do with it. It has a VGA port built in, but where are you going to store the images that appear on the screen?

Digilent Zybo Z7: an SoC Zynq-based line, and the newer replacement for earlier Digilent Zybo boards. You get the ARM core(s) plus a bunch of peripherals, but it's pricier.

$200 is my rough ceiling for "you can't afford the project even if you can afford the board." If you are experienced enough to know why you want an expensive board, you don't need my advice.

Terasic: I find their product line confusing, but the same principles apply: SoC variants, external RAM, and peripherals come in lots of different combinations. If you need to pay more than $200 for a board, you are probably buying too much board.

Numato: they have a couple of intriguing sub-$50 boards (Elbert v2, Mimas v2), but these use older Spartan chips.

Nandland Go Board: a good effort on introductory materials, but the board is very limited in peripherals and capability; you are probably better off with a slightly pricier board.

Lattice "ICE40" and similar boards: Lattice is a third- or fourth-place vendor in the market. The main attraction here is the completely free development toolchain, which is a defensible ideological position, but the chips that support that ideology are small and limited compared to the Xilinx and Intel/Altera options.

Xess.com: has a couple of compact boards, and a number of interesting PMod expansion boards. All open-source designs, plus an introductory book, FPGAs, Now What? [PDF], which I found useful even though I was retargeting it to my Arty board.

Other introductory material

This is a complex technology that goes through rapid product evolution, so material quickly goes out of date. The educational board vendors (Digilent & Terasic) seem to do a pretty good job of maintaining introductory demonstration projects.

I found it useful to work through the Xess "FPGAs, Now What?" book, along with Free Range Factory's "Free Range VHDL" book. Peter Ashenden's The Designer's Guide to VHDL is a thorough introduction to the VHDL language which was useful as a follow-on. I was somewhat disappointed in Blaine Readler's VHDL By Example, which seemed to veer away as soon as it approached any tricky part of the language.

I think a strong case can be made that SystemVerilog is preferable to VHDL. The main drawback of Verilog is its much weaker typing, which also allows it to be more concise. It superficially resembles C, but the resemblance to software languages is more of a trap than a help. One must always remember that FPGA development is designing hardware (on which software may run).

Thursday, April 14, 2016

1213486160 or 1347703880

As a public service, I would like to remind everyone that 1213486160 == 0x48545450 is 'HTTP' read as a big-endian 32-bit integer in ASCII or UTF-8. 1347703880 == 0x50545448 is 'HTTP' read as a little-endian 32-bit integer. Neither is likely to be a packet length.
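A quick way to check a suspicious "length" value, sketched in Python:

```python
import struct

# Interpret the four ASCII bytes 'HTTP' as a 32-bit unsigned
# integer in each byte order.
big_endian = struct.unpack(">I", b"HTTP")[0]
little_endian = struct.unpack("<I", b"HTTP")[0]

print(big_endian)     # 1213486160 == 0x48545450
print(little_endian)  # 1347703880 == 0x50545448
```

If either number shows up where a length should be, you are almost certainly reading the start of an HTTP response as binary framing.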

Monday, January 21, 2013

Converting ITS documentation

I've been playing around a bit more with the Incompatible Timesharing System (ITS). I figured out enough networking on my Mac to be able to use the Java Supdup terminal application to interact with KLH-10 simulating the KS10 architecture.

As part of my study, I have also been converting some INFO files to Texinfo, starting with the file INFO;DDT > (slow link to the ITS system), which Bjorn Victor converted to an HTML version: DDT Primer. I like Texinfo because it can (theoretically) produce nice-looking books for an ebook reader like my new Kindle Paperwhite, as well as Info output for Emacs navigation, and HTML.

Actually distributing these files is problematic. ITS generally seems to have operated with a pretty free notion of distribution: send a magnetic tape to pretty much anyone with a PDP-10 who asked, without executing any kind of explicit license agreement, or even having copyright notices on the text. Anyone with access to the system could take it upon themselves to edit these files. It is pretty much impossible to tell who wrote or collaborated on these files, at what time, for whom they were working, and what their employer (mostly the AI Lab at MIT) agreed to allow. Some of these files were electronic versions of AI memos and other MIT publications.

MIT apparently released a fraction of the ITS code (not enough to actually use, and without the documents) under GPL, but fuller distributions have been made informally, probably originating with the former operators of the ITS systems. Some initial versions of these dumps had a bunch of personal e-mail files and the personal data in the user data base, and later versions were scrubbed of most of this information. Even these scrubbed versions are generally unavailable, though this may be from negligence in the web hosting rather than the result of a legal takedown request.

Technically speaking, the ebook conversions are still a bit messy: I am using dbtoepub (a Ruby script converting DocBook to EPUB) and a 'texinfo-to-mobi' shell script which invokes makeinfo, dbtoepub, and Amazon's kindlegen binary to create a Kindle document. I have issues with the table of contents, chapter headings for untitled chapters, and links from the index not navigating to the ideal place. I also found Emacs Info doesn't like it if I use a UTF-8 multibyte character: it seems to count by characters where makeinfo counted by bytes.
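The byte-versus-character mismatch is easy to demonstrate. In Python (my illustration, not makeinfo's actual code), the lozenge is one character but three UTF-8 bytes, so a byte-counting tool and a character-counting tool will disagree about every offset past it:

```python
# The lozenge (U+25CA) is one character but three bytes in UTF-8,
# so byte-counting and character-counting tools compute different
# offsets for everything after it.
text = "Altmode echoes as \u25ca in Supdup"

n_chars = len(text)                   # character count
n_bytes = len(text.encode("utf-8"))  # UTF-8 byte count

print(n_bytes - n_chars)  # 2: the lozenge costs 3 bytes but 1 character
```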

I am using UTF-8 to provide a (SAIL-style?) lozenge rendering of "Altmode." Altmode is ASCII code ESC (27 decimal, 33 octal, 0x1B hex), although sometimes it was octal 175; it seems ASCII and terminals changed what they meant over time. Much of the time, Altmode echoes as a dollar sign $. In Supdup (see RFC 734), octal 4033 is apparently the ◊ lozenge character, and *that* can be used to echo Altmode. I like the lozenge better; it stands out in a Courier-style face much more than $ does. Unicode denotes it as U+25CA. The TeX side of Texinfo doesn't really "get" Unicode, or even very many "funny characters," but I was able to find a TeX macro in plain.tex for \diamond that I could hack into my Texinfo document, and my conversion tools translated it to Unicode acceptably.
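For reference, the code points involved, as I understand them (a small sanity-check sketch; the Supdup 4033 value is an extended code outside plain ASCII, so it isn't checked here):

```python
# The Altmode code points discussed above, as a sanity check.
assert 0o33 == 27 == 0x1B         # ASCII ESC, the usual Altmode
assert chr(0o175) == "}"          # octal 175, the older Altmode slot,
                                  # is '}' in present-day ASCII
assert "\u25ca" == "\N{LOZENGE}"  # U+25CA, the lozenge

print(oct(27), hex(27), "\u25ca")
```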

The "keystroke" index is a bit funky: it doesn't like my Altmode characters. Metavariable notation using <var> is not supported by the semantic markup: Info shows it as VAR, which I think is a really bad choice when all the other command keys are in upper case. Likewise for control characters; I'd like ^Z, etc., to look nice and be semantically understood for indexing, but not use C-Z or C-z notation. I'd also like to be able to specify initial Texinfo variable settings from the command line (so that I could say 'makeinfo --altmodechar=$' or '--altmodechar=\033' to render my Info files in other ways), but only boolean flags can be set that way.

Thursday, December 13, 2012

Basic ITS hacking

OK, so ITS has only one level of directory in its file system, otherwise known as the SNAME portion of a filename.

How do you create a directory if there isn't one already? For instance, if you login as an unknown user JAO and ITS complains it can't find MD:USERS1;LOGIN JAO, and Emacs won't let you create a file USERS1;LOGIN JAO because USERS1; doesn't exist, what do you do?

I expected there to be a DDT command to create a directory. But there isn't one. I found a cryptic comment saying to "look at the documentation for the OPEN UUO." So I did. Examining SYSDOC;_CALLS.124:
The file names  ..NEW. (UDIR)  cause a new directory
to be created with the given sname if none already
exists.  Creating a directory in this way causes a
message to be printed on the system console.
(A directory is destroyed only when the disks are
salvaged by the stand-alone salvager, which is generally
run just before the time-sharing system is restarted.
A directory is then destroyed iff it contains no files.) 
I.e., a magic file name which, when opened, causes a new SNAME to be created. I used Emacs Find File: ^X^F USERS1; DSK: ..NEW. (UDIR) (note that Emacs is a bit scrambled in how it presents file names), but you probably should do something simpler, like asking DDT to
TODO: Document here what I did for PWORD, PANDA

Still to puzzle out: what to do with INQUIR and how to simulate ACOUNT if I have already logged in as the user.

Getting KLH-10 and ITS to work under Mac OS X

I've cycled back to being interested in the Incompatible Timesharing System (ITS). In the interim, I replaced my G4 iBook with an Intel-based MacBook Pro. With a bit of hacking, I was able to get local networking to communicate with the simulated PDP-10, and thought I would record some of the details.

I needed to
  • Install tuntaposx. I used the macports version.
  • Crudely hack the ks-base-its build in klh10-2.0h from panda-dist (and additional patches) to define KLH10_NET_TUN=1 under Mac OS X. I need to clean up my own sources and put them up on GitHub. Also, I needed to hack around the removal of "mtio.h" from Mac OS X 10.6, borrowing the minimum definitions from FreeBSD headers. I guess Apple decided they really didn't want tape drives connected to their machines. Maybe it is still in OS X Server?
  • There are a few rough edges in tun/tap support. Following a hint I found online, I open a bash shell as root, and execute
    • exec 4<> /dev/tap0 # create the interface by opening the device
    • ifconfig tap0 # .100.110 is the address for the KLH10, 200.105 is the host Mac OS X network address
  • In KLH10, configure dpimp parameters to open the tap interface address
    • devdef imp  ub3   lhdh   addr=767600 br=6 vec=250 ipaddr= gwaddr= debug=0 ifc=tun0 dpdebug=0 dedic=true doarp=true
  • Boot ITS using the MD image, built to expect as the IMP address and dumped.
    • set the value of the environment variable KLH10_HOME
    • cd ${KLH10_HOME}
    • ./kn10-ks klh10-md.ini
  • Once ITS is up, from another shell I can then
    • telnet
    • :TCTYP vt52 # convince ITS I am not using a hardcopy terminal
  • Using the Java supdup client from Bjorn Victor's site, I can
    • File > Connect
I'm still a bit foggy on what dpimp is doing with /dev/(tap|tun)0 and arp commands. The arp command seems to fail, but the network still comes up. It isn't accessible to other computers on my LAN, although (surprise!) my Mac OS X firewall seems to have been turned off. Ignoring for now that ITS is a dangerously trusting Internet host, I think it should be possible for ITS itself to respond to ARP. What I don't understand yet is whether tap/tun will show ARP requests from other machines to ITS, whether ITS knows how to reply to them, or whether I need the Mac OS X network stack to route external packets for .100.110 onto the tap interface, broadcast that routing information, or reply to ARP requests on behalf of the ITS machine.

What I really want is a chapter suitable for inclusion in Stevens' TCP/IP networking book that explains tun/tap and how networking with virtual machines can work. The existing documentation for these kinds of things tends to be "here's a script that works" rather than a solid explanation.

Friday, December 23, 2011

Restoring the Heathkit Jr 35

Heathkit Jr Electronic Workshop "35", Model JK-18

The Heathkit Jr 35 evidently had been stored with batteries in the battery holder, which corroded badly. I replaced it with a plastic RadioShack 4 D-cell battery holder (270-389), wired into the negative power rail and the hot side of the power switch (which is part of the variable resistor).
Corroded battery holder
Replacement battery holder mounted
I mounted the replacement battery holder with a dozen "heavy duty" 1 inch foam double-stick mounting squares (rated to hold 900g), stacked in four groups, each three high, to clear the various screws protruding through the main board.
The remote speaker station

The rubber foot used to protect the relay from being crushed when you turn the kit over was stuck to the remote speaker station. I put it back in its rightful place.
Relay with the brown rubber foot
Earlier, I had built a couple of simple circuits, which showed the meter and new battery pack working, but also showed me that the telegraph key switches were not making good contact. I used a Scotch-Brite scrubbing pad to remove corrosion from the key switches on the main board and the remote station.

I ran through the tests in the manual appendix to check out the lamp, power switch, speaker, earphone, antenna coil continuity, remote station speaker & telegraph key, slide switch, and relay.

I then got ahead of myself, building the 4-transistor AM radio experiment; I got slight hints of an audio signal if I wiggled and touched some of the wires. I'm highly suspicious of the electrolytic capacitors. I probably should have tested those (and the transistors) before using them in such a complex circuit. The wire connections to the springs are not all that reliable, and the wires might have some oxidation on them.

Saturday, October 1, 2011

Heathkit Jr. 35

I just picked up a Heathkit Jr. "35" circuit experimenter kit on eBay. I had one of these as a child; I don't remember how we came to own it, but I had an itch to re-read the manual. Buying the manual from would have cost as much as the kit + manual. I remember being befuddled by the descriptions of how transistors worked. It turns out the descriptions of capacitors and transistors were a bit sketchy: I think it is pretty much impossible to understand from the text how the AM radio experiments actually function. The operation of the ferrite-core antenna and its various windings is unexplained as well.

I was a bit surprised to find all four transistors in the kit are identical. Thanks to this Heathkit part cross-reference, I see that they are part 417-118, a.k.a. 2N3393 NPN transistors.