I have a new mechanical keyboard on order, which lets you hot-swap the key switches. When you do that, it's a good idea to do a quick test to see if all the keys still work. There are many tools to do this, but I decided to make my own web-based one. I think it came out pretty nicely.
You can see if the keys work by typing them one at a time, or (usually) by running your finger along each row.
In addition to that, you can also see how much key rollover your keyboard supports, i.e., how many keys you can press at the same time. Higher-end keyboards usually support "n-key rollover" (NKRO), which means that any combination of keys will be recognized correctly. This is useful for some types of games.
Old keyboards usually don't support any additional rollover, so they can only correctly recognize arbitrary combinations of two keys. (This limitation doesn't apply to shift/control/alt or command/windows.) Additional keys may or may not be recognized. However, it doesn't really matter too much if you can't type ZF6. For non-gamers, what matters is that when you type quickly, one finger may not have released the previous key yet while another finger is already pressing the next key. Turns out that on many keyboards that don't support real 3-key rollover, you can still type three (or more) keys at the same time, as long as those keys are all on different columns. Which means: you type them with different fingers. After all, how are you going to type E, D and C at the same time, when touch typists type all three of those with their left middle finger?
A common additional limitation (which avoids more complex USB communication) is a cap of six simultaneous regular keys, known as 6KRO.
What the rollover tests (for 3 to 7 KRO) do is try combinations of all the regular alphanumeric keys and most punctuation, and then tell you whether it looks like you have full key rollover, enough rollover for touch typing, or no real rollover for that number of keys. You can of course also just press as many keys as you like at the same time and see if they register.
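The core of a test page like this is small: listen for keydown and keyup events and keep a set of the keys that are currently down. A minimal sketch (this is not the actual code of my page):

<script>
var pressed = new Set();
document.addEventListener("keydown", function (e) {
    e.preventDefault();        // keep the browser from acting on shortcuts
    pressed.add(e.code);       // e.code identifies the physical key
    document.title = pressed.size + " keys down";
});
document.addEventListener("keyup", function (e) {
    pressed.delete(e.code);
});
</script>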
I recently got more interested in cooking, so I started looking for recipes on the internet. Then I found out that in the US, it's customary to list the amounts for many ingredients in teaspoons, tablespoons and cups, in addition to using pounds, pints, quarts and two types of ounces. So I decided to make this page that will let me (and you) convert between these different measurements and the units the rest of the world understands: milliliters and grams. I wanted to make this easy so you could do it on the go on a phone or a tablet while cooking, hence the sliders rather than having to type in numbers.
This was the first time I used Javascript for a significant amount of functionality, and that was actually relatively easy.
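For reference, these are the conversion factors involved, sketched here in Python (the page itself does this in Javascript):

# US customary units (not the slightly different imperial ones);
# volume converts to milliliters, weight to grams
ML_PER = {
    "teaspoon": 4.93,
    "tablespoon": 14.79,
    "fluid ounce": 29.57,   # the volume ounce...
    "cup": 236.59,
    "pint": 473.18,
    "quart": 946.35,
}
G_PER = {
    "ounce": 28.35,         # ...and the weight ounce
    "pound": 453.59,
}

def to_metric(amount, unit):
    if unit in ML_PER:
        return amount * ML_PER[unit], "ml"
    return amount * G_PER[unit], "g"

print(to_metric(2, "cup"))  # (473.18, 'ml')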
Porting it to Python turned out to be too large a project, not helped by the many loops and multidimensional arrays. So I went looking for a BASIC interpreter for the Mac, and found PC-BASIC, a cross-platform GW-BASIC emulator.
As luck would have it, ZEESLAG.BAS contains the BASICODE subroutines for GW-BASIC, so the program runs as-is. The only issue is that it runs way too fast. You'll want to add a couple of zeros to the delay loops on lines 20160 and 20560. (Delay loops? Yes: back then, the way to make a BASIC program pause was to count to some large number in an empty loop.)
The next step was to see if I could run the whole thing from the command line on the terminal. PC-BASIC does let you run it from the terminal, but only if your terminal settings are recognized. A quick export LC_ALL=C does wonders here. However, PC-BASIC didn't recognize the terminal settings for the VT420, and I couldn't find any setting that worked. Until I realized I could run the program in screen, and that worked:
export LC_ALL=C
export TERM=vt420
screen
/Applications/PC-BASIC.app/Contents/MacOS/pcbasic -t
load "zeeslag"
list 20160
(use the cursor keys to go to the 400 value, change it, press enter)
list 20560
(use the cursor keys to go to the 1000 value, change it, press enter)
run
So it can be done. That should be enough nostalgia for a while.
In 2009, I started an effort to digitize all my cassette tapes. As my last computer that still has a line-in port is facing retirement, I decided to finally finish that project. Perhaps more about this later. Turns out some of these old cassettes have weird things on them, including radio broadcasts that contain computer programs.
Back in the 1980s, home computers didn’t come with any storage, and a Commodore 64 floppy drive cost the same as the computer itself here in Europe. So it was common to use a cheap cassette drive to store programs and data. You could of course buy commercial software and/or exchange copies with friends. But without a modem, which didn’t become common here until around 1990, there was no good way to exchange data with larger like-minded groups. It also didn’t help that there were many different home computers, all incompatible with each other.
Both these problems were addressed with BASICODE: a lowest-common-denominator subset of the BASIC programming language that all home computers came with. For the essential functions missing from this common subset, BASICODE provided a set of standardized subroutines. So if you wanted to run BASICODE programs, all you had to do was write the subroutines for clearing the screen, setting the cursor position, et cetera for your specific computer, and then you could run all BASICODE programs.
But you still had to get these programs. Solution: a standardized cassette data format. BASICODE for a given computer model would typically be able to read and write BASICODE from/to cassette. There were also little adapters that plugged into the serial port. These days, we can use the program minimodem to decode BASICODE from WAV files. I pulled pvtmert/minimodem from Docker Hub to decode my recordings.
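The invocation was something along these lines; the exact flags and file names may need tweaking for your recording (BASICODE is 1200-baud FSK, with 1200 Hz for a 0 bit and 2400 Hz for a 1 bit):

docker run --rm -v "$PWD":/data pvtmert/minimodem \
    --rx --mark 2400 --space 1200 --stopbits 2 \
    --file /data/recording.wav 1200 \
    | tr '\200-\377' '\000-\177' > newsletter.txt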
(The tr command strips the high bit that is set to 1 in the BASICODE protocol.)
In addition to broadcasting a selection of various user-submitted BASICODE programs, the radio program Hobbyscoop also broadcast a weekly newsletter in the form of a BASICODE program. Turns out I have a recording of the 250th one, from 1989. This is how the program came off the tape.
After cleaning it up, I wanted to run the program on my C64 Mini, but for the life of me I couldn’t find the C64 BASICODE subroutines. So I made my own. Get the .prg file here if you want to try for yourself. Or try it here with a Javascript BASICODE interpreter.
Then I decided to see how hard it would be to make BASICODE run in Python. Have a look at the resulting code here. Turns out that, small syntax differences aside, a lot of BASIC statements are basically the same in Python, and it’s easy enough to implement most missing ones with a few lines of code. The big difference is that in BASIC, you mostly need to structure programs yourself using GOTO statements, while modern languages like Python are much more structured and don’t have GOTO. Also, in BASIC all variables are global. So the porting is easy, but not entirely trivial.
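To illustrate with a made-up fragment (this is not from the newsletter program): a backward GOTO is really just a loop, so it maps naturally onto a Python while.

# BASIC original:
#   1000 A = 0
#   1010 A = A + 1
#   1020 IF A < 10 THEN GOTO 1010
#   1030 PRINT A
# Python port:
a = 0
while a < 10:    # the IF ... GOTO 1010 becomes the loop condition
    a = a + 1
print(a)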
The hardest part was getting the reading of the cursor position to work properly. In xterm you do this by sending an ANSI escape sequence to the terminal, after which you get one back that you read from standard input. Strangely, this was also the hardest part on the Commodore 64, where I eventually had to call a KERNAL (system) routine to do this.
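In Python, that exchange looks roughly like this (a minimal sketch, not the actual code from my port):

import sys
import termios
import tty

def cursor_position():
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setcbreak(fd)              # read the reply byte by byte, no Enter needed
        sys.stdout.write("\x1b[6n")    # "device status report": where is the cursor?
        sys.stdout.flush()
        reply = ""
        while not reply.endswith("R"): # the terminal answers ESC [ row ; col R
            reply += sys.stdin.read(1)
        row, col = reply[2:-1].split(";")
        return int(row), int(col)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

print(cursor_position())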
There's an episode of the TV show Friends where Chrissie Hynde has a guest role. Phoebe feels threatened by her guitar playing, and asks her "how many chords do you know?" "All of them."
Wouldn't it be cool if you could give the same answer when someone asks "how many programming languages do you know?"
But maybe that's a bit ambitious. So if you have to choose, which programming language or languages should you learn? I got started with BASIC, 6502 assembly, Forth and Pascal. Those are now all obsolete and/or too niche. These are the other languages I'm familiar with that are still relevant today:
C
Javascript
Python
PHP
I'd say that either C or Javascript is the best choice as a first language to learn. Javascript has the advantage that you only need a web browser and a text editor to get started, and you can start doing fun and useful things immediately. However, Javascript's object orientation and heavy use of events make it hard to fully understand for someone new to programming. So it's probably better to dip a toe in with Javascript and after a while start learning another language to get a better grasp of the more advanced fundamentals.
C is the opposite of Javascript. It's certainly not very beginner friendly, not least because it requires a compiler and a fair bit of setup before you can start doing anything. And then you get programs that run from the command line. It's much harder to do something fun or useful in C. However, what's great about C is that it's relatively simple and low level, which means that it's an excellent way to learn more about the way computers and data structures actually work. Because it's a simple language, it's a reasonable goal to learn the entire language; that's especially important when reading other people's code. Also, many other languages such as Java, Javascript and PHP are heavily influenced by C, so knowing C will help you understand those languages better.
If you want to be productive as a programmer and could only use one language, Python is probably the one. It's used for many different things and has some really nice features to help you get going quickly. But it also has plenty of quirks of its own, and complexity hides just below the surface. So, as with Javascript, I would use Python as a "dipping your toe in" language and, if you want to learn more, switch to something else. A big advantage of Python over C is that you don't need a compiler, but it still (mostly) lives on the command line.
PHP is the language that I've used the most over the last 20+ years. If that hadn't been the case, I'm not sure it would have been on this list. It's not held in very high regard in many circles, so if you want something that looks good on your CV, PHP is not a top choice. Then again, it works very well for web backends, and has an incredible amount of stuff built in, allowing you to be productive quickly. It's also close to C in many ways, so that helps if you already know C. But like Javascript and Python it's a dynamic language, so it takes a lot less work to get things done than in C.
Of course a lot depends on what you want to do. For stuff running in a browser, Javascript is the only choice. For low-level stuff, C is the best choice, although Python could work in some cases, too. I think for web backends, PHP is the best fit, but Python can certainly also do that. For developing iOS apps, you need Swift or Objective-C; for Android, Java or Kotlin. Mac apps are also generally written in Objective-C, with Swift (Apple's relatively new language) becoming more common. On Windows, a lot of stuff is written in C#. A lot of lower-level stuff, especially graphics, is done in C++. So these are all very useful languages, but I wouldn't recommend any of them as a first language.
So let's have a look at a simple program in each of those languages, and then see how fast they run that same program. For a given input, the program calculates what numbers that input is divisible by. (It's not optimized in any way and there is no error checking, so it's not an example of good code.)
C:
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int n, i;

    n = atoi(argv[1]);
    if (n % 2 == 0)
        printf("%d\n", 2);
    for (i = 3; i < n; i += 2)
        if (n % i == 0)
            printf("%d\n", i);
    return 0;
}
PHP:
<?php
$n = $argv[1];
if ($n % 2 == 0)
    printf("%d\n", 2);
for ($i = 3; $i < $n; $i += 2)
    if ($n % $i == 0)
        printf("%d\n", $i);
Javascript:
<script language="javascript">
var ts1, ts2, n, i;

ts1 = new Date();
n = 444666777;
if (n % 2 == 0)
    document.write(2 + "<br>\n");
for (i = 3; i < n; i += 2)
    if (n % i == 0)
        document.write(i + "<br>\n");
ts2 = new Date();
document.write("Time: " +
    (ts2.getTime() - ts1.getTime()) / 1000 +
    " seconds<br>\n");
</script>
Python:
import sys

n = int(sys.argv[1])
if (n % 2 == 0):
    print(2)
for i in range(3, n, 2):
    if (n % i == 0):
        print(i)
(For the Javascript version I hardcoded 444666777 as the input, for the others the input is read from the command line.)
Common wisdom is that compiled languages like C are faster than interpreted languages (the others). That turns out to be true, with the C version (compiled with -O3 optimizations) taking 0.7 seconds on my 2013 MacBook Pro with a 2.4 GHz Intel i5 CPU.
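For reference, this is roughly how to compile and time it yourself (assuming you saved the C version as divisors.c; the file name is my choice here):

cc -O3 -o divisors divisors.c
time ./divisors 444666777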
But interestingly, the Javascript version is barely any slower, at just over 1 second. This shows just how much effort the browser makers have poured into making Javascript faster.
The PHP version, on the other hand, takes more than 21 seconds, and the Python version 50 seconds. Weirdly, 15 of those 50 seconds were spent running system code. This is because the Python program uses up 6 GB of memory on my 8 GB system, so the system has to do all kinds of things to make that work.
It turns out that the for loop with the range function is the problem: range first creates the requested range of numbers in memory (all 222 million of them!) and only then does the for loop go through them. (This is Python 2 behavior; Python 3's range, like Python 2's xrange, generates the numbers on demand.) But we can replace the for loop with a while loop:
import sys

n = int(sys.argv[1])
if (n % 2 == 0):
    print(2)
i = 3
while (i < n):
    if (n % i == 0):
        print(i)
    i = i + 2
This does the same thing, but in a way that's more like the for loops in the other languages. This version takes 36 seconds, and, more importantly, there are no issues with memory use.
C can do these calculations really fast because there's very little overhead in feeding the CPU lots of small operations; in the other languages, each operation carries more overhead. With more complex operations, such as manipulating text strings, C's advantage is a lot smaller, because each operation in the program then translates into a much larger number of CPU instructions, so the language overhead is a much smaller part of the running time. I haven't been able to think of a nice simple test program to see how big the difference is, though.