Sunday, November 21, 2004

Core dump

You are probably wondering what "core dump" means. It is a term from the old Unix-type systems denoting a special file, called "core", that was written immediately before the forced, abnormal termination of a program. That file contained the memory ("core") contents of the program at the moment of termination, thereby providing a snapshot of what the program was doing when it crashed.
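To make this concrete, here is a minimal sketch of a program that crashes; the file name "crash.c" is just an illustration, and the exact shell message varies between systems:

    $ cat crash.c
    int main(void) { int *p = 0; return *p; }   /* dereference NULL on purpose */
    $ cc -g -o crash crash.c                    /* -g keeps debugging symbols */
    $ ./crash
    Segmentation fault (core dumped)

The "(core dumped)" note, and the "core" file it refers to, appear only when the system is allowed to write one, which brings us to the next point.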

In many modern Unix systems core dumps are disabled by default, because end users have no use for them. The bash shell provides the "ulimit" builtin command, whose "-c" option reports the current core file size limit. Usually it is set to 0, meaning that no core is dumped when a program fails. You can enable core dumps by raising the limit to a value large enough to hold the memory contents of your program (or to "unlimited"). Don't forget to check your filesystems periodically for leftover "core" files, possibly with a command like "find / -name core".
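As a sketch of the procedure, assuming a bash shell and the hypothetical "crash" program from above (bash reports the limit in 512-byte blocks, and details vary between systems):

    $ ulimit -c                                 # report the current core file size limit
    0
    $ ulimit -c unlimited                       # allow cores of any size in this shell
    $ ./crash
    Segmentation fault (core dumped)
    $ ls core
    core
    $ find / -name core -type f 2>/dev/null     # hunt for leftover cores later

Note that the new limit applies only to the current shell and the processes started from it.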

Making meaningful use of a core file requires a debugger like gdb that can match an executable with its core and produce a human-readable view of the program's internal state at the time of failure. This is called "post-mortem analysis". Many modern systems, including Windows XP, attempt something similar by encouraging the user to submit "crash information" to the developers.
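A minimal post-mortem session with gdb, again using the hypothetical "crash" program and its core file, might look like this (the output shown is illustrative):

    $ gdb ./crash core
    Core was generated by `./crash'.
    Program terminated with signal 11, Segmentation fault.
    (gdb) bt                                    # backtrace: where did the program die?
    #0  0x08048334 in main () at crash.c:1

The backtrace shows the chain of function calls that led to the crash, which is usually enough to locate the offending line.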

In my case, "core dump" is the act of writing out my thoughts in free form, guided by occasional inspiration. I strive to record my thoughts and the questions I face, like a snapshot of my mind (possibly also meant for debugging). Some other people also have weblogs called "core dump", so the choice was not unique. Still, it is quite accurate.

PKT