The International Obfuscated C Code Contest has a newly revamped web site, and the Judges have announced the 28th contest, to coincide with its 40th anniversary. (Or 41st?)
The Judges have also updated the archive of past winners so that as many of them as possible work on modern systems. Accordingly, I took a look at my 1998 winner to see how much damage time hath wrought.
When it is built, my program needs to go through the C preprocessor twice. There are a few reasons:
- It’s part of coercing the C compiler into compiling OFL, an obfuscated functional language. OFL has keywords `l` and `b`, short for `let` and `be`, so for example the function for constructing a pair is defined as

      l pair b (BB (B (B K)) C CI)

  In a less awful language that might be written

      let pair = λx λy λf λg (f x y)

  Anyway, the first pass of the C preprocessor turns an `l` (let) declaration into a macro,

      #define pair b (BB (B (B K)) C CI)

  and the second pass expands the macros. (A sketch of how this works follows the list.)

  (There’s a joke in the README that the OFL compiler has one optimization, function inlining (which is actually implemented by `cpp` macro expansion), but in fact inlining harms the performance of OFL.)

- The smaller the OFL interpreter, the more space there is for the program written in OFL. In the 1998 IOCCC rules, `#define` cost 7 characters, whereas `l` cost only one. I think the modern rules don’t count C or `cpp` keywords, so there’s less reason to use this stupid trick to save space.

- Running the program through `cpp` twice is a horrible abuse of C and therefore just the kind of joke that the IOCCC encourages. (In fact the `Makefile` sends the program through `cpp` three times, twice explicitly and once as part of compiling to machine code. This is deliberately gratuitous, INABIAF.)
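Here’s a minimal sketch of how one `cpp` pass can write macros for the next. The mechanism is my reconstruction from the description above, not the entry’s actual source, and `sketch.c` is a made-up file name:

    /* sketch.c: in pass 1, `l` is an ordinary macro whose expansion
     * happens to spell out a #define directive; cpp emits those
     * tokens as plain text instead of obeying them. */
    #define l #define
    l pair b (BB (B (B K)) C CI)

Running `cpp -P sketch.c` prints (modulo whitespace) `#define pair b (BB (B (B K)) C CI)`, and only the second `cpp` pass treats that line as a real macro definition.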
There were a couple of ways this silliness caused problems.
- Modern headers are sensitive to which version of the C standard is in effect, wrt things like `restrict` keywords in standard library function declarations. The extra preprocessor invocations needed to be fixed to use consistent `-std=` options so that the final compilation doesn’t encounter language features from the future.

- Newer `gcc` emits `#line` directives around macro expansions. This caused problems for the declaration

      l ef E(EOF)

  which defines `ef` as a primitive value equal to `EOF`. After preprocessing this became

      #define ef E(
      #line 1213 "stdio.h"
      (-1)
      #line 69 "fanf.c"
      )

  so the macro definition got truncated. The fix was to process the `#include` directives in the second preprocessor pass rather than the first. (There’s a sketch of the fix after this list.)

  I vaguely remember some indecision when writing the program about whether to `#include` in the first or second pass, in particular whether preprocessing the headers twice would lead to trouble. First-pass `#include` seemed to work and was shorter, so that was what the original submission did.
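Here’s a sketch of the fix, assuming it reuses the same directive-spelling trick as `l`; the `inc` macro name and the build commands are illustrative, not taken from the entry or its `Makefile`:

    /* Pass 1 never reads <stdio.h>: `inc` merely spells out an
     * #include directive as text, so EOF is not yet a macro and
     * passes through as a plain token. (`inc` is hypothetical.) */
    #define l #define
    #define inc #include
    inc <stdio.h>
    l ef E(EOF)

    /* Illustrative build, with a consistent -std= on every pass:
     *   cpp -std=c89 -P entry.c > pass1.c
     *   cpp -std=c89 -P pass1.c > pass2.c
     *   cc  -std=c89 -O3 -o entry pass2.c
     */

After pass 1 the output reads `#include <stdio.h>` followed by `#define ef E(EOF)`. Pass 2 performs the include, but a macro’s replacement list isn’t expanded at definition time, so `EOF` is stored verbatim and no `#line` markers can land inside the definition; it only becomes `(-1)` when `ef` is finally used during compilation.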
There’s one further change. The IOCCC Judges are trying to avoid compiler warnings about nonstandard arguments to `main`. To save a few characters, my entry had `int main(int c) { ... }`, but the argument `c` isn’t used, so I just removed it.
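Concretely, the change amounts to something like this (the exact form of the replacement is my assumption):

    /* before: saved a few characters, but warns under modern compilers */
    int main(int c) { /* ... */ }

    /* after: the unused argument is simply gone */
    int main() { /* ... */ }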
The build commands still print “This may take some time to complete”, because in the 1990s if you tried to compile with optimization you would have been waiting a long time, if it completed at all. The revamped `Makefile` uses `-O3`, which takes `gcc` over 30 seconds and half a gigabyte of RAM. Quite a lot for less than 2.5 KiB of C!