
Driving an SDA5708 LED Matrix Display with a Raspberry Pi Pico

I got my hands on a Siemens SDA5708 display from an old Nokia satellite receiver. The display is composed of 8 digits, each made of a 7-rows-by-5-columns LED matrix. I looked it up on the internet and found these two articles. I couldn't find a proper datasheet, but the first article gave enough explanation of how the display works, while the second suggested using the SPI protocol to drive it.

I decided this would be a nice project to use my Raspberry Pi Pico, which I bought on release a couple of years ago but never got around to doing anything with.

The Display

The SB-Projects article does a really good job at explaining how the display works, but let me summarize.
It has 6 pins: VCC (pin 1) and GND (pin 6) on the outside, with the middle ones used for Chip Select (pin 2), Data (pin 3), Clock (pin 4) and Reset (pin 5). The Data pin is also called MOSI (Master Out - Slave In): it's the pin the master (in this case, the Pico) uses to send data to the slave (in this case, the display). The Pico pinout calls this pin SPI TX.
SPI usually also has a MISO pin (Master In - Slave Out), used by the slave (display) to send data back to the master (Pico). The Pico pinout calls this pin SPI RX. However, the display never sends any data back to the controller, so there's no need to wire it.
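As a rough sketch of how this maps onto the Pico with the C SDK, assuming SPI0 and some arbitrarily chosen GPIOs (the display doesn't care which SPI-capable pins you pick):

    #include "pico/stdlib.h"
    #include "hardware/spi.h"
    #include "hardware/gpio.h"

    /* Assumed wiring (placeholder GPIO numbers):
     *   display pin 3 (Data)  -> GP19 (SPI0 TX)
     *   display pin 4 (Clock) -> GP18 (SPI0 SCK)
     *   display pin 2 (CS)    -> GP17 (plain GPIO)
     *   display pin 5 (Reset) -> GP20 (plain GPIO)
     *   display pin 1 -> VCC, pin 6 -> GND
     */
    #define PIN_MOSI  19
    #define PIN_SCK   18
    #define PIN_CS    17
    #define PIN_RESET 20

    static void sda5708_io_init(void) {
        spi_init(spi0, 100 * 1000);                 /* 100 kHz, a conservative guess */
        gpio_set_function(PIN_MOSI, GPIO_FUNC_SPI);
        gpio_set_function(PIN_SCK,  GPIO_FUNC_SPI);

        /* Chip select and reset are driven as ordinary GPIOs, idle high. */
        gpio_init(PIN_CS);
        gpio_set_dir(PIN_CS, GPIO_OUT);
        gpio_put(PIN_CS, 1);

        gpio_init(PIN_RESET);
        gpio_set_dir(PIN_RESET, GPIO_OUT);
        gpio_put(PIN_RESET, 1);
    }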

It also has three registers: one Control Register, one Access Register and one Column Data Register.

Driving the display

After setting up the environment and the C SDK, I started studying the SPI example program, and from there I began writing my own C program. I soon found a little quirk of the display: the bytes need to be sent in reverse order, meaning the least significant bit must be sent first.
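With the pins set up as in the earlier sketch, a single transfer is just: pull chip select low, clock one byte out with the SDK's blocking write, release chip select. A minimal sketch:

    /* Clock one byte out to the display.
     * Assumes spi0 and PIN_CS were configured as in the sketch above. */
    static void sda5708_send(uint8_t byte) {
        gpio_put(PIN_CS, 0);                   /* select the display       */
        spi_write_blocking(spi0, &byte, 1);    /* shift the byte out on TX */
        gpio_put(PIN_CS, 1);                   /* deselect again           */
    }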

At first I thought it might have something to do with the endianness of the RP2040 (the microcontroller on the Pico), but that didn't really make much sense. I then came across this article, pointing to this GitHub project, which was also sending the bytes backwards, so it isn't something on my end and must be something on the display's end. It kinda remained a mystery to me, and I thought there was some missing information in the SB-Projects article, until I re-read it while writing this post and noticed this line

The least significant bit D0 is loaded first. 

which I somehow missed the first 5 or so times I read it.

Sending the least significant bit first simply means that each byte must be written backwards. With this comes the need to reverse a given bit sequence, so that it can be sent the right way around.
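A straightforward way to do that is a small helper that mirrors the eight bits of a byte (just a sketch; the library may well do it differently):

    #include <stdint.h>

    /* Mirror a byte so bit 0 ends up in the bit 7 position: the display
     * wants D0 first, while the SPI peripheral shifts the MSB out first. */
    static uint8_t reverse_byte(uint8_t b) {
        uint8_t out = 0;
        for (int i = 0; i < 8; i++) {
            out <<= 1;
            out |= (b >> i) & 1;   /* take bit i of the input, append it */
        }
        return out;
    }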
It was easy to use a lookup table associating each digit (e.g. 0, the first digit) with the corresponding identifier (0b111), already reversed so it gets sent in the correct order (0b11100000). Packets can then be crafted with bitwise operations, by ORing the data with the correct header. The same can be applied to brightness settings.
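As an illustration, here's a sketch using the digit-0 value from above; the other table entries and the header value are placeholders, the real ones live in the library:

    /* Digit identifiers stored pre-reversed, as described above:
     * digit 0 is 0b111, i.e. 0b11100000 once mirrored into a byte.
     * The remaining entries are left out here on purpose. */
    static const uint8_t digit_id_rev[8] = {
        0b11100000,   /* digit 0 */
        /* digits 1..7 ...        */
    };

    #define BRIGHTNESS_HEADER_REV 0x00   /* placeholder header byte */

    /* Craft a brightness command by ORing the (reversed) level onto the header. */
    static inline uint8_t brightness_packet(uint8_t level) {
        return BRIGHTNESS_HEADER_REV | reverse_byte(level & 0x07);
    }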

Once I got the test program working properly, I decided to write a little C library for the display. You can find it here (mirror).

It has all the functions to manage the display, which means:

The repo also bundles a little example to show off the display's capabilities. The library includes a header file that defines a font. It took about 1 hour to write the whole library, but about 1.5 hours to write the font alone (complete with errors and a ton of mirrored letters)! That's why it only includes the 26 letters of the alphabet in upper-case, plus an empty space character. No numbers, no lower-case letters, no special characters; I was kinda tired at that point.
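To give an idea of what the font data looks like, a 5x7 glyph can be stored as seven row bytes using only the low five bits of each. This is just a sketch of the general idea; the actual encoding (and whether the rows are pre-reversed) is in the header file:

    /* One 5x7 glyph = 7 rows, 5 significant bits per row.
     * Sketch of how 'A' could be encoded; the library's format may differ. */
    static const uint8_t font_A[7] = {
        0b01110,   /* .###. */
        0b10001,   /* #...# */
        0b10001,   /* #...# */
        0b11111,   /* ##### */
        0b10001,   /* #...# */
        0b10001,   /* #...# */
        0b10001,   /* #...# */
    };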

So if you want to submit a pull request to add more characters to the font, I'll be glad to accept it :)

Here's how it looks: I used a Dupont cable to hold it in place on the breadboard for the photo.