Thursday, July 07, 2005

European software patents rejected (for now)




This is hard to believe. I'm very pleased with the outcome of the vote on software patents. With an absolute majority of 648 votes against, this is the first time a legislative proposal has been rejected at second reading in the European Parliament.

This outcome does not settle the issue of software patentability once and for all, nor does it prevent a similar piece of legislation from reappearing in the European Parliament after some time. Clearly, a solution must eventually be found, for the mutual benefit of inventors and the general public.

However, this vote has proven that people are indeed able to affect major political decisions (without resorting to violence or revolution, of course!) even in a huge multi-national bureaucratic organisation like the EU. It's a significant victory because the public has been educated and our representatives, hopefully, now understand that any future attempt to approach this issue must be very carefully considered and discussed.

For a brief news report, you may visit EUobserver.

PKT

P.S. I'm curious to see the reaction of R. M. Stallman. Here is some more information from the Free Software Foundation (Europe).

Sunday, July 03, 2005

Fun with the Linux cpufreq driver!





Have you ever used a computer with the (famous) turbo button? Some PCs, usually XT clones with a 10 MHz 8088 processor (16-bit core, 8-bit bus), included a turbo button that switched the speed between the "compatible" 4.77 MHz and the "fast" 10 MHz. Although it was practically useless, since it usually stayed on "high", its existence was nevertheless quite intriguing.

Now you can have the same kind of fun with your desktop processor by changing its speed at will. All you need is a recent Linux kernel and a supported processor. Version 2.6 should do fine, although only 2.6.10 and later worked for my Asus K8V (BIOS 1.007b) + Athlon 64. Supported processors are (as of version 2.6.12.1):

--- CPUFreq processor drivers
  • ACPI Processor P-States driver (generic, may work)
  • AMD Mobile K6-2/K6-3 PowerNow! (mobile, rare cpu)
  • AMD Mobile Athlon/Duron PowerNow! (mobile)
  • AMD Opteron/Athlon64 PowerNow!
  • Cyrix MediaGX/NatSemi Geode Suspend Modulation
  • Intel Enhanced SpeedStep
  • Intel Speedstep on ICH-M chipsets (ioport interface)
  • Intel SpeedStep on 440BX/ZX/MX chipsets (SMI interface)
  • Intel Pentium 4 clock modulation
  • nVidia nForce2 FSB changing
  • Transmeta LongRun
  • VIA Cyrix III Longhaul
You need to do the following:
  1. Enable the relevant BIOS option, if necessary. In my case, I had to enable Cool'n'Quiet for the Athlon 64. Some older BIOS versions may be buggy, but check whether yours is actually affected before doing an unnecessary BIOS upgrade.
  2. Enable the "cpufreq" driver under "Power Management" in the Linux kernel configuration.
  3. Choose your CPU from the list of drivers.
  4. Choose a "cpu frequency governor" that tells the kernel when to change frequency.
  5. Recompile the kernel and reboot.
  6. Go to /sys/devices/system/cpu/cpu0/cpufreq and try the various options.
The available CPU frequency governors, as of 2.6.12.1, are:

Default CPUFreq governor (performance)
  • --- 'performance' governor: fastest speed and highest power consumption, unless you manually override it
  • <*> 'powersave' governor: slowest speed and lowest power consumption, unless you manually override it
  • <*> 'userspace' governor for userspace frequency scaling: a user-space program (daemon) uses more complicated "rules" and "policies" to ensure optimal behavior. This is a complex solution that is most appropriate for notebooks or when very precise control of frequency scaling is needed.
  • <*> 'ondemand' cpufreq policy governor: tries to use the slowest speed as much as possible, but quickly switches up or down when needed
  • <*> 'conservative' cpufreq governor: tries to use the slowest speed as much as possible and is more reluctant to change speeds (either from high to low, or the opposite).
Note that I have checked (included in the kernel) all governors. You can choose any of them at run time and it doesn't hurt to have them around.

If you have done all of the above correctly, then after a reboot you should see something like the following (it may vary depending on the actual processor, of course):
powernow-k8: Found 1 AMD Athlon 64 / Opteron processors (version 1.00.09e)
powernow-k8: 0 : fid 0xe (2200 MHz), vid 0x2 (1500 mV)
powernow-k8: 1 : fid 0xc (2000 MHz), vid 0x6 (1400 mV)
powernow-k8: 2 : fid 0xa (1800 MHz), vid 0xa (1300 mV)
powernow-k8: 3 : fid 0x2 (1000 MHz), vid 0x12 (1100 mV)
cpu_init done, current fid 0xe, vid 0x2

This message appears after the network initialization and before the ACPI initialization on my computer. You may review kernel boot messages with "dmesg | less".

Here we see that the processor is currently at 2200 MHz (fid 0xe) and 1500 mV (vid 0x2).

We should now see a directory "/sys/devices/system/cpu/cpu0/cpufreq/" (and similarly for cpu1, cpu2, etc.).

Simple commands:
cd /sys/devices/system/cpu/cpu0/cpufreq/
cat scaling_available_governors # list the available governors
echo powersave >scaling_governor # lock the CPU to its minimum speed
OR
echo performance >scaling_governor # lock the CPU to its maximum speed

You may also try:
cat scaling_max_freq # show max speed
cat scaling_min_freq # show minimum speed
cat scaling_cur_freq # show current speed

Minimum and maximum can also be manually set (within the preset cpu limits, of course) so that speed never goes above or below a certain point.

For example:
echo 1800000 >scaling_max_freq # Maximum speed that we want to use is 1.8GHz
echo 1500000 >scaling_min_freq # Minimum speed that we want to use is 1.5GHz

The frequency must be given in kHz. You cannot write "echo 3.2 >scaling_max_freq" to mean 3.2 GHz; you must use "echo 3200000 >scaling_max_freq" instead.

Now try "cat /proc/cpuinfo" and see your current speed!!

If you want automatic speed changing, you can try the "ondemand" governor. It works quite well, although for some CPUs, especially the Athlon 64, the "conservative" governor might be a better match. In my experience, the conservative governor hesitates a lot before changing speeds.
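Since the sysfs entries above are just plain text files, anything that can read and write text files can drive them; the shell commands are simply the most convenient way. As a minimal illustrative sketch (assuming the cpu0 paths shown above and root privileges, with error handling kept to a bare minimum), here is the same governor switch done from C:

#include <stdio.h>

/* Minimal sketch: switch cpu0 to the "ondemand" governor and print the
   current frequency. Assumes the sysfs paths shown above and that the
   program runs as root. */
int main(void)
{
    const char *gov = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor";
    const char *cur = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";
    unsigned long khz;
    FILE *f;

    if ((f = fopen(gov, "w")) == NULL) { perror(gov); return 1; }
    fprintf(f, "ondemand\n");
    fclose(f);

    if ((f = fopen(cur, "r")) == NULL) { perror(cur); return 1; }
    if (fscanf(f, "%lu", &khz) == 1)
        printf("cpu0 now runs at %lu MHz\n", khz / 1000);
    fclose(f);
    return 0;
}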

As an example, right now my processor "idles" at 1000 MHz and 1.1 V. This lowers power consumption significantly. If I want to play a game or compress some music I can easily increase its performance or let the "ondemand" governor do it automatically. This is a great trick, and everyone should try it!!

Have fun...

Note: this does not constitute overclocking and should not pose a danger to your cpu in any way. Appropriate frequency switching is a great way to reduce heat dissipation and power consumption when you are doing tasks that are not cpu-intensive. That being said, however, I do not assume any responsibility if you decide to try this.


PKT

Saturday, July 02, 2005

TFT + CRT desktop




This has been debated a million times. Everyone (and their dog) knows the absolute truth about TFT and CRT and has a very clear opinion that can usually be summarized as "TFT is teh r0x0rz" or "TFT is teh sux0rz".

Since I recently bought a decent 19" TFT, I have been amazed by the significant differences I discovered between these two technologies in various tasks. Take my opinion for what it is, i.e. an opinion, not a bullet-proof double-blind experiment.

For the record, the CRT is an excellent flat Nokia 730C with 96 kHz horizontal sync. It displays 1600x1200, but I usually prefer 1152x864@100Hz. The TFT is an Iiyama ProLite E485S 19" (very few reviews of that one on the Internet...) that displays 1280x1024@75Hz. It has a DVI connector and it is quite cheap at 390 euros. For details see this page in ...French (the technical specifications are still readable, of course). It seems that the model has been discontinued and is no longer shown on the UK or US pages. I drive both with a Radeon 9800 Pro.

I'll make my observations into a short list:
  • The TFT's fixed native resolution is not as restrictive as I thought. Even in non-native modes, like 800x600, the quality is acceptable for games. I assumed that I'd have to use 1280x1024 for everything, but that is not the case.
  • This monitor has a 23 ms (nominal) response time, but in practice, even in very fast games like Quake 3 or UT2004, I was not bothered much. I could use a snappier response, but I had no trouble playing Q3 at "hardcore" levels. For strategy, RPG and other less intensive games the screen is fine.
  • The refresh rate does matter, although you don't need 120Hz to get a steady image with a TFT. 60Hz does flicker a little on the TFT and I routinely choose 75Hz.
  • The analog output is obviously degraded when I use both screens at the same time. This makes the CRT look hazy. I suppose there is a point in buying "high-end" (as in Matrox, for example) graphics cards for multi-CRT work.
  • I did not see a huge difference moving my TFT from analog to DVI. It's probably better but not by much. Your mileage may vary depending on the quality of your graphics card (DAC) and the quality of the TFT's digitization circuits (ADC).
  • The TFT's color is simply not as good as the CRT's. The Nokia has very good color performance and a side-by-side comparison running two copies of the same movie reveals huge differences. I used to think that the color performance of the TFT was really good, until I saw the same movie scene side by side. The only way to get a good color range with the TFT is to use insane levels of brightness. When both monitors are adjusted to brightness levels that don't make my eyes hurt, the color range of the TFT is limited. Most importantly, black is not really ... black. It looks more like dark green. On the other hand, the color is quite homogeneous and the contrast is strong and appropriate for text and office use. I have configured the TFT at 24% brightness and 36% contrast. More expensive TFTs may be better at this, but I cannot afford them right now!!
  • Subpixel antialiasing for fonts is a controversial issue. Some fonts are markedly improved, others are degraded. I suppose that the font design should be taken into account. Admittedly, the Windows fonts are quite good with subpixel antialiasing (Microsoft calls this "ClearType") and most users are right in preferring them. The Linux/X/freetype algorithm is also very good and provides excellent results with some fonts.
  • I have noticed one faulty subpixel in my screen. Luckily, it is at the right boundary and I never see it in ordinary conditions. For those that want a precise test, this page can help (and, no, you don't have to pay anything)!

I'm quite happy with my new monitor, but the TFT technology must certainly be improved. Unfortunately, the response time has become a major selling point, even though the majority of users really won't notice the difference between 25ms and 8ms for everyday tasks. I'd rather have excellent color instead. From a marketing perspective, excellent color is not something you can easily quantify, while the 8 vs 12 vs 16 ms response time is something that even the most naive user can grasp.

PKT

Tuesday, June 28, 2005

Oh, the irony... Creationism ads in my blog!





I just noticed that Google is displaying advertisements in my sidebar that promote the "Intelligent Design" concept. Admittedly, I did put the AdSense code in the sidebar myself, but this was not what I anticipated. I guess that my recent (?) post regarding Evolution was relevant enough.

Well, I certainly don't expect any of my readers to be clicking any of these ads, so I'll probably remain a poor Ph.D. candidate. I think you should click them and see why "Intelligent Design" is an unbelievably bad idea. I can assure you that the sites linked to them are genuine fun for anyone with a basic understanding of the scientific method.

I had briefly quoted Karl Popper in my aforementioned post and I need to repeat his proposition that any existential statement ("there exists an X that does Y") is verifiable but not falsifiable, and is therefore clearly metaphysical (no possible experience could refute it). As such, any statement of the form "evolution was done by X" is clearly not scientific. It may be true, but we may never know. And it isn't science.

I have spent quite some time reading Popper's excellent "The Logic of Scientific Discovery" since my last post. I picked it up in Edinburgh, where I was visiting a friend (greetz to my 733t friend hax0r, vvas). It illuminates the methodology of science in a way that makes Popper's propositions immediately compelling. It's one of those books that seem obvious once you understand what they say. Truly, the work of a genius.

I'll spend the next few minutes disproving one of the ridiculous claims that I read on one of the advertised "intelligent design" sites. These people claim that natural selection fails to explain the concept of "return to normalcy" (what statisticians call regression toward the mean), according to which talented people may not produce even more talented children, while, inversely, average people may give birth to talented children.

First error: natural selection cannot be disproved by individual examples. Natural selection explains the evolution of a species, not the evolution inside a specific genealogical tree.

Second error: the concept of "return to normalcy" is a probabilistic one. The selection of paternal and maternal genes is a random process and a child usually does not get all the good genes that his father or mother have to offer. As a result, a child usually falls in between his parents for most quantitative attributes. As a crude example, if the father has Genius_gene_A and Average_gene_B and the mother has Genius_gene_C and Average_gene_D, 25% of their offspring will be pure genius, 50% will be like their parents (one genius and one average gene) and 25% will be pure average. The expansion of these "combinations" when more genes are implicated gives rise to the so-called binomial distribution. On average, most children will be similar to their parents and will usually fall between them. Out of sheer luck (?), gifted children can be born to average parents, without the need to invoke "divine intervention". Similarly, gifted parents may have average children.
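If you prefer to see the arithmetic spelled out, here is a toy enumeration of that two-gene example (the gene names are the made-up ones from the paragraph above, not real loci):

#include <stdio.h>

/* Toy enumeration of the two-gene example: the father passes on
   Genius_gene_A or Average_gene_B, the mother Genius_gene_C or
   Average_gene_D, each with probability 1/2. Counting how many "genius"
   variants a child inherits reproduces the 25% / 50% / 25% split. */
int main(void)
{
    int father[2] = { 1, 0 };   /* 1 = genius variant, 0 = average variant */
    int mother[2] = { 1, 0 };
    int count[3] = { 0, 0, 0 }; /* children with 0, 1 or 2 genius variants */

    for (int i = 0; i < 2; i++)
        for (int j = 0; j < 2; j++)
            count[father[i] + mother[j]]++;

    for (int k = 0; k <= 2; k++)
        printf("%d genius gene(s): %d of 4 combinations (%d%%)\n",
               k, count[k], count[k] * 100 / 4);
    return 0;
}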

PKT

P.S. I just noticed that the creationism ads are gone! Oh, well...

Monday, June 27, 2005

Stopping coffee







I usually don't wish to talk about myself on this page. I prefer to present things that I consider interesting. In this case, however, I'll make a small exception, in the hope that my experiences will be of value to some of you.

During the last year I increased my caffeine intake considerably. I was serving in the army and I had to wake up at 05:50-06:30 or stay up during the night. Caffeine did help. Besides the actual pharmacological effect (which may gradually diminish), the enjoyment of small pleasures, like coffee, is sometimes psychologically indispensable. Now that I enjoy considerable working freedom (being a Ph.D. candidate can be beneficial for your sleep habits, especially if you prefer staying up late instead of waking up early), I decided to gradually reduce my caffeine consumption.

(you may safely skip the following 4 paragraphs if you have no interest whatsoever in the biological properties of caffeine and similar molecules)

The caffeine molecule is a methylated purine that bears some relation to the DNA bases and to energy-storage molecules like ATP. The broad class of methylated xanthines (a kind of purine) includes several variants that can be found in natural (and not-so-natural) sources.

It is a lesser-known, but true, fact that caffeine renders cells sensitive to DNA damage, especially from ionizing radiation (see here and here, for example). This is usually of interest in cancer radiotherapy, but the effect also occurs in normal cells. This raises the question of whether caffeine increases the susceptibility of normal cells to DNA damage, mutations and cancer. Fortunately, there is no indication to support this idea. Surprisingly, some studies even indicate that caffeine may decrease cancer risk in healthy individuals.

On the other hand, increased caffeine intake does increase fat metabolism and reduce appetite, and may help during weight loss. It also improves athletic performance, which is why it is a controlled substance (high levels in athletes are considered "doping"). The relevant bibliography is very, very confusing because many "weight-loss" drug companies sponsor studies of dubious quality that get published in low-impact journals (the impact factor is a rough measure of a journal's credibility, based on how often other publications cite it). As a general rule, increased caffeine intake cannot be routinely recommended as a way to promote weight loss because of significant side effects, although it might work for some people. The exact mechanism is not entirely clear at the moment.

Finally, another significant effect of caffeine ingestion is the induction of anxiety. Caffeine consumption over 200 mg/day can be a major cause of anxiety for some people and should be evaluated in anyone who presents with anxiety-related symptoms.

All the above being said, I must admit that I enjoy a good cup of coffee and I don't consider this particularly bad for my health (in moderation, of course). I usually drink high-quality espresso that I prepare at home with a nice Krups espresso machine. Its caffeine content, when prepared with Arabica beans, is quite low and the flavor is excellent. As a side note, I don't like complicated coffee-based hot drinks (a double extra latte frappuccino with whipped cream is nice, but it isn't coffee by my definition).

Anyway, I decided to reduce my daily double-espresso ritual to 2-3 espresso shots per week. At first I had a constant headache, especially at the back of my head. This was quite annoying but not strong enough to make me want an aspirin or paracetamol. The other weird effect was the disruption of my sleep hours. I didn't sleep more, I just slept at odd hours. I could fall asleep at 16:00, wake up at 20:30 and then sleep again at 05:00. It seems that my insistence on drinking coffee strictly in the morning had conditioned my brain to accept this as "wake-up time".

A few days later, my headache is gone and my sleep is generally back on schedule. Most importantly, I can get up at almost any time with very little difficulty. Waking up is much less painful now that I don't need coffee to get me going. I simply open my eyes, stand up and I am completely awake and ready. My athletic performance has dropped a little (~5-8%) judging by my average cycling speed, but it's hard to provide an accurate estimate because the wind and the ambient temperature vary a lot these days. Not that it matters much, since I don't really compete with anyone (and, of course, I can always drink coffee if I feel the need for speed).

PKT

European software patents







It should come as no surprise that the process for the adoption of US-style software patents in Europe is progressing. There is considerable lobbying from major software companies to that effect.

I am not a lawyer and I cannot provide a reliable analysis of the proposed framework. However, I do know that several open source and free software efforts are threatened by patent laws. This is an important problem.

My experience in the biomedical sciences shows that, in the long term, zealous patent holders only serve to aggravate the problem by inflating the costs of research. The end result is that every respectable lab or university tries to acquire patents in order to survive economically and afford the expensive (patent-encumbered) research materials. Repeat this process and see where it leads...

In my mind, the advancement of human knowledge and the benefit of society at large take precedence over individual or corporate rights and benefits. I'd rather have some companies (or even individuals) temporarily deprived of their rightful privilege to patent intellectual property than allow the premature adoption of an ill-formed directive that puts science and free software at risk. Patent holders can wait until a good solution has been found.

For that reason, I think that the current EU patent proposal should be re-evaluated. I understand that the EU does need a central patent mechanism. I understand that the current system is inadequate. However, I'm not convinced that the proposed directive is a satisfactory solution.

I urge my EU readers (not that I have many readers, but anyway...) to consider the matter carefully and act accordingly.

Many details can be found in the excellent Europa site. This page contains lots of related information.

Also click this link for some (admittedly biased) information.



PKT

Friday, May 06, 2005

The evolution debate...




In issue 7037 of Nature (a very prestigious scholarly publication), the front cover is devoted to the growing "intelligent design" movement that has gradually become a significant force in the US. I am deeply concerned that this phenomenon may gradually erode the scientific status of formal evolutionary theory and influence the education of schoolchildren and university students.

Let me begin by saying that all serious scientists openly support the theory of natural selection as the most plausible explanation for evolution. As an example, the Nature article quotes Bruce Alberts (president of the National Academy of Sciences, and also famous for his biology textbook, which is freely available online here) as saying that "(...the concept of intelligent design) doesn't deserve any attention, because it doesn't make any sense".

This is not my main point, however.

The most important thing to always remember is that religion (and any theory that invokes supernatural intervention has a religious basis) and science do not address the same issues. Religion is not about how man came to be, but rather about why man came to be and what principles should guide his behaviour. The issues that religion addresses are ethical and spiritual, not scientific. In that sense, religion operates on a higher level than science and the two should not mix, because they cannot contradict each other if each is interpreted in its proper context.

Finally, according to the famous philosopher Karl Popper (see here and here), any "theory" that is not falsifiable (i.e. can never be proven false) does not constitute science. Since it is completely impossible to prove that supernatural intervention never took place, the "intelligent design" theory does not constitute science (curiously enough, the proposition that aliens from Alpha Centauri guided the evolution of life on Earth is a scientifically acceptable proposition, even though it is almost certainly bogus!).

PKT

Sunday, May 01, 2005

Happy Easter, by the way...

Today is Orthodox Easter, so "Χριστός Ανέστη" (Christ is risen) to all of you.

What about the bike????

The bicycle (an Ideal) I recently bought is a mountain-road hybrid. It has a nice front suspension, a comfortable straight handlebar and somewhat thicker tires (700x35C, which is wider than 700x25C and other road tires but narrower than mountain tires). It can take some physical abuse without problems and is (or at least appears) reliable and strong. It is heavier than my previous (road) bike, but the difference is not staggering and the price is right.

I don't see much purpose in riding mountain bikes in the city and on roads (much heavier, considerable rolling resistance) but I could not live with a pure road bike. Having the extra versatility of a hybrid is frequently worth the performance hit.

PKT

On bicycles and the perception of time...




I recently (March 30, 2005) bought a new bicycle.

Relying on your own effort and strength to transport yourself makes the riding experience real. The wind, the small hills, the turns, the surroundings all suddenly become very significant. You can feel the wind in your face, your leg muscles burn and your hands strain from the effort.

There is something very primitive and powerful that stems from these sensations. While theoretically one can be attentive and alert while cruising in a luxurious car, in practice the sensory insulation that we frequently perceive as comfort drastically lessens the emotional impact of our daily rituals, including transportation.

"A la recherche du temps perdu" is a famous (but very rarely read) novel by Marcel Proust that speaks, indirectly, about a search for "lost time". Time that went by while we were not paying attention to the details of our lives, if I may grossly oversimplify a multi-thousand-page novel. In a similar spirit, Thomas Mann, in his novel "The magic mountain", claims that while routine and boredom can make an hour seem like an eternity, they can easily consume several years without leaving an aftertaste, a recollection of significant events. In the end, memory is the only meaningful measure of time.

A bicycle is not merely kid's play or a fascinating sport. It symbolizes our desire to participate, to cherish the moments of life. Speed and comfort are the rhythm of oblivion, because they preclude attention and memory, if I may paraphrase Milan Kundera ("La lenteur", "Slowness").

PKT

Wednesday, January 12, 2005

Predictions for 2015 (continued)

I skipped the predictions on biology and medicine. Here they are (2015 is still quite far ahead, so I'm not that late!). See the bottom of this post for a short explanation of some terms.

Biology
The function and importance of many more genes will be elucidated by 2015. This is just the linear projection of a current trend. Much remains to be done when it comes to gene polymorphisms, the small variations between genes that are otherwise almost identical. Several common human diseases, like hypertension, dyslipidaemia or diabetes, are multifactorial and a clear genetic component can be hard to identify. However, polymorphic gene variants may confer slight susceptibility and their accumulation leads to a substantial risk, even though each of them is clinically insignificant on its own. Understanding the impact of the various polymorphisms can be very hard. The genome has ~30-40 thousand genes, according to most experts, but many of those can have several polymorphisms and their differences in function can be very slight.

Artificial biological organisms, even complex ones, will definitely be created. They might even become the "machines of the future". Researchers are already capable of building viruses from scratch by assembling components of the viral genome. Microorganisms that perform a specific task will be highly efficient and may be used in industry. We are already making crude use of "genetic engineering" for several purposes, but the fine tuning that will become possible will multiply the applicability of these methods. I am, of course, worried about the prospect of ultra-efficient biological weapons, but progress is never without danger.

Stem cells are definitely an ultra-hot issue right now. A lengthier discussion is warranted at some later date. Roughly speaking, think of easy tissue transplantation (new kidneys, heart, lungs) and even the structural remodelling of destroyed tissue (in situ repair of damaged tissue, e.g. brain or spinal cord). It is too early to say whether stem cells will deliver on their promises, but if they do, 2015 and beyond will be a very, very interesting time to be alive.

Medicine

Viruses

The significance of viral infections is, in my opinion, understated. The majority of the population suffers from repeated viral infections (flu, the common cold, gastroenteritis, infectious mononucleosis, childhood diseases). Even though bacterial infections can be noisy and produce a fulminant clinical picture, a huge percentage of everyday infections are usually attributed to viruses, and much less information is available about them. Several viruses are able to produce oncogenic transformation (i.e. cause tumors), others enter a chronic latent state from which they occasionally emerge (e.g. herpes simplex, varicella-zoster virus and many others), and new ones are found frequently. A complex symbiotic relationship between viruses, humans and several other species is now beginning to be explored (e.g. SARS, attributed to an animal reservoir). Our understanding of viral infections may illuminate several non-infectious diseases that have a genetic and an unknown environmental component (i.e. some people are vulnerable, but not all vulnerable people get the disease).

I won't offer a specific prediction on AIDS, but a lot of research is happening and the stakes (= money for the pharmaceutical industry) are now VERY high. The progress in this field has been extraordinary for a disease that was discovered just over 20 years ago.

Cancer
The main advances in cancer will come from prevention and early detection. Cancer is a very complex disease and the therapies we possess are relatively crude. There can never be a universal drug (or class of drugs, or combination of drugs) that will cure all cancers. Instead, we will gain ever-increasing knowledge of the various cancer sub-types and their associated optimal drug regimens, which will slowly but steadily lengthen the life expectancy of cancer patients. At some point we might even be able to discern specific cancer "genotypes" (i.e. analyze the gene patterns in the malignant tissue of a specific patient) and choose an optimal matching therapy for that one patient. This will cost a great deal of money, because the components of that therapy will have to be tailored to the patient, like a hand-made suit. Cancer detection will improve, and instead of trying to cure stage IV patients, which is very hard, we might simply be able to readily detect curable stage I disease (think Pap smear).

Hot research
The most popular fields of medical and biological research appear to be, in no particular order: Alzheimer's disease, AIDS and cancer. You should expect to see major advances in those fields. Cardiovascular diseases are now a relatively mature field and near-optimal prevention strategies are available for patients and the general population (their actual application can be very hard, judging, for example, by the percentage of obese people in Western countries). The drug pipeline for non-terminal diseases can be quite long (for safety testing, etc.) and a 10-year period won't allow us to witness tremendous improvements in these diseases. Instead I expect steady, stepwise improvements and perhaps a few spectacular discoveries.

Anyway, this is just my view of things, and since there are many things I don't know, there are even more things I couldn't possibly predict, so prepare to be surprised. I apologize for errors and omissions; this is something I wrote in an afternoon, not a doctoral thesis!

Short explanation of terms:
  • wild-type gene: found in a normal population, considered "normal"
  • mutant gene: a gene with an altered sequence (compared to the wild-type) that does not appear frequently in the population and is (usually?) associated with a disease or abnormal state
  • gene locus: the actual position on the DNA strand where the gene resides
  • gene polymorphism: a group of similar but not identical wild-type genes that are not apparently associated with some disease and are found frequently in a normal population. Their small differences may explain the differences between organisms, like hair color, stature or intelligence.
  • idiopathic diseases: those that do not have an identifiable cause (from idios = one's own; an idiopathic disease is one that arises on its own, with no known external cause)

Tuesday, December 28, 2004

Predictions for 2015

I am amazed by the pace of scientific and technical progress. It is becoming increasingly hard to predict the shape of things to come. People thought that by 2001 we would be roaming the galaxies (which might have happened if space travel had remained the top priority of the civilized world). The most important aspect is not the difficulty of a problem, but the amount of resources allocated to its solution. In that sense, the most important determinant of future technologies and discoveries is the perceived importance of a problem, which of course changes over time. A few years ago nobody cared about space exploration, so progress in that sector declined. This emphasizes the importance of market and sociopolitical trends in shaping the priorities of research. Scientists enjoy cool projects (e.g. prion research was a very hot topic even though many, many more people die from malaria than from Creutzfeldt-Jakob disease) and there is such a thing as research "fashion".

Space technology
Interest in space exploration has heated up considerably for several reasons. Systems like GPS are the cornerstone of modern military operations, telecommunications (including the Internet, TV and phones) really need satellite links, and of course space progress has once again become a matter of prestige and competition between China, the EU and the US. Even private parties have started designing and funding space-related projects. Most likely we will see some significant progress in the years to come. Several new technologies are being tried and I really anticipate a significant breakthrough by 2015 (a manned mission to Mars? a new space station? a new lunar landing?).

Computer technology
Many people enjoy quoting "Moore's law" (observation or proposition would be a more accurate term), which roughly states that microchip progress is exponential. Impressively enough, this has held for many years, from my Amstrad 6128 (Z80 processor) to today's monsters like the Athlon 64. Even faster progress has occurred in the video card arena. Will this progress hold? It depends on several factors. Until now the software market has not failed to eat up all available computing power, thereby sustaining the demand for new hardware.

Games and entertainment have a long way to go until a really life-like experience is available to the consumer, so this is obviously an area where more innovation will bring higher hardware requirements. The gaming experience will likely be the main driving force in the years to come, with massively multiplayer online games being the most important trend. Single-player games may even become complex enough to compete with cinematic movies. Already we see many movies being accompanied by the launch of related video games (e.g. Spiderman, Lord of the Rings and many others).

Distributable media will become much less important because network speeds increase much faster than removable media capacities. By the time a new media standard gains enough market penetration to make it usable, network speeds often render it obsolete (e.g. a CD or even a DVD is now practically downloadable, as demonstrated by the network distribution of Half-Life 2). On top of this, it is much cheaper to sell game or music downloads than to sell game boxes. (Anyone remember the first games of the Ultima series? Ultima III included a nice cloth map, a 100+ page manual, a wearable ankh amulet and many 5.25" disks. Most modern games include 2-3 CDs and a leaflet! Obviously we're getting less and less in terms of physical game content.)

Artificial intelligence applications are also gaining momentum (the once fashionable "artificial intelligence" term is no longer used, but OCR, e-mail classification, voice recognition and web searches often fall in the AI domain). The most obvious use for AI is making sense out of an ocean of information (web, email) and automating time consuming tasks. Will speech recognition really make it? Probably, but newer generations are already comfortable with keyboards so there won't be a huge demand for a speech-centered user interface, which would require considerable design changes.

Multimedia uses are unlikely to generate a much higher demand for computing power in the long run. Encoding and decoding video is a relatively feasible task even for very high resolutions that surpass PAL/NTSC requirements. Audio is already at multichannel 96 kHz/24-bit for mainstream audio cards and it is highly unlikely that more channels (!) or higher audio resolution will be required. Sure, some people will always dream of the ultimate audio/video experience, but as proven by the CD and, most importantly, the MP3, ease of use and availability are always more important than technical merit. The DVD succeeded because it is practical and clearly superior to VHS. The Blu-ray/HD DVD and DVD-A/SACD technologies will have a very hard time and might fail despite being pushed really hard by the content companies for their DRM capabilities. These technologies will be replaced, in the long run, by some network access technology, just like games. Already iTunes and others are quite popular. We might not see movies or music in physical formats by the year 2015.
Physical media may still persist for "collectors" (ultra platinum edition of Star Wars MCXXI and Lord of the Rings mega-extended edition with 256 DVDs including all footage that the director has ever shot in his life).

For the next few years, the ultra-hot research topics in computer science will probably (still) be massively parallel algorithms for cluster systems and multi-core devices and (will we be so lucky?) quantum computing. I can't really make a useful prediction, since I am not a computer scientist.

Energy

A very significant project that has recently generated some controversy is the fusion reactor that will be built in France (ITER). There is heated debate regarding the location of the reactor, but one thing is for sure: the EU and other countries (e.g. Japan) would surely like to disentangle themselves from fossil fuels. Cars will start burning diesel fuel that comes from renewable sources, or will use a combination of electric and diesel power. Already BMW, for example, makes some very impressive diesel engines, and given that diesel engines have a higher theoretical efficiency, there is no reason to rule out an increase in their adoption. Energy is a very hot priority and we are really likely to see alternative solutions in a reasonable timeframe.

...to be continued

Wednesday, December 08, 2004

CISC to RISC (and back again?)

Everybody knows that x86 chips are CISC chips. The everlasting battle between CISC and RISC fans has started to fade, but the question remains: is there anything in common between a modern x86 processor, let's say the Pentium 4, and the CISC relics of the past? Most readers will agree that modern x86 chips are essentially RISC cores with a CISC front end and "emulated" extended instructions.

Looking at the x86 instruction set can be fun. Whole sets of instructions appear totally out of context today, for example the ASCII adjust family (AAA, AAD, AAM, AAS), the decimal adjust family (DAA, DAS), the XLAT instruction and several others that have probably never been generated by a compiler. Consider the XLAT instruction, which translates a byte to another byte using a lookup table. Using it would have been a shortcut to loading the register, calculating a memory address with it (remember that the 16-bit processors had very limited address calculation facilities, in stark contrast to the 386 and later processors) and loading the resulting value. Today the XLAT instruction is probably implemented in microcode and executes much more slowly than an explicit load-lookup-store sequence of generic RISC-like instructions. As a matter of fact, since the Pentium II many of these instructions have been prohibitively slow and nobody has noticed. In that sense, the x86 became RISC-like.
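To make the comparison concrete, here is roughly what XLAT does for a single byte, written out as the explicit load-lookup-store that a compiler would emit anyway (the uppercase table is just a made-up example for illustration, not anything the original instruction mandates):

#include <stdio.h>

/* Roughly the C equivalent of XLAT: AL = table[AL]. The explicit version
   compiles down to a plain load-lookup-store sequence of simple instructions. */
static unsigned char xlat_byte(const unsigned char table[256], unsigned char al)
{
    return table[al];
}

int main(void)
{
    /* Build an example translation table: lowercase -> uppercase. */
    unsigned char to_upper[256];
    for (int i = 0; i < 256; i++)
        to_upper[i] = (i >= 'a' && i <= 'z') ? (unsigned char)(i - 32) : (unsigned char)i;

    char msg[] = "xlat is rarely missed";
    for (char *p = msg; *p; p++)
        *p = (char)xlat_byte(to_upper, (unsigned char)*p);

    printf("%s\n", msg);   /* prints: XLAT IS RARELY MISSED */
    return 0;
}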

On the other hand, the x86 instruction set has grown significantly with a huge number of extensions: MMX, extended MMX, 3DNow!, extended 3DNow!, SSE, SSE2 and SSE3. Many more, relatively minor additions have also been made (anyone remember cmpxchg8b?). The point is that all these extensions, especially the SIMD ones, resemble the 16-bit remnants of the past in that they address a very specific problem which we find significant today but which may become trivial in the future or be solved in another way. In that sense, the x86 continues the CISC tradition by providing instructions for very specific tasks. However, things do change. If, for example, multi-core chips become popular, it might be easier to have 16 simple cores than 2 complex ones. A simple core might be hard-pressed to execute complex multi-unit instructions like MMX and SSE and would prefer simpler RISC-like instructions. In that case a multi-threaded, rather than SIMD, approach may be much faster and the whole class of SIMD instructions may be deprecated.

Over the course of many years the x86 architecture has evolved significantly. The CISC heritage will never change, but the modern Athlon 64 in 64-bit mode is a much better processor, not only in terms of performance but also in the quality of its design. The future is going to be a very interesting place.

PKT

Thursday, December 02, 2004

Έρως ανίκατε μάχαν



"Love is invincible"
(the entry title is in Greek, ISO-8859-7 encoding)

I recently read a book called "The Rule of Four" by Ian Caldwell and Dustin Thomason. I will not provide a detailed review, since many can be found on the Internet and in the appropriate publications. The authors appear promising, but I personally dislike the whole archeological-mystery "genre". I believe that the book would have been much better without the ancient mysteries and modern crime stories. That said, some parts of the book, especially those that deal with university life and young people's quest for identity and meaning, are occasionally brilliant.

I selected a short quote that particularly impressed me. A truly Kundera-esque analysis of "love is invincible" appears in the text, delivered by the father to his young son, the protagonist. The interpretation of the phrase is melancholic but interesting. The popular understanding of the phrase is that "love can help us overcome all difficulties" or "we can achieve anything for the sake of love". Contrary to these beliefs, the father points to a drawing of Eros[1] (Έρως) holding a sword, instead of the usual bow and arrows, and viciously killing its/his foes. Love/Eros is strong, but is not necessarily an ally. Eros is a powerful force that we cannot resist. The original phrase, drawn from Antigone (Αντιγόνη), a tragedy by Sophocles, provides the context to which the authors stay true. Love/Eros is no better than madness, because it makes people defy ethics, logic and law.

For those familiar with the original tragedy text this explanation is immediately apparent, but for me this revelation caused a shift in perspective. Far too often we hear that Love/Eros is a force for good, a power that helps us. Love/Eros is a power to which we are subjected and this is--in a way--a tribute to its greatness and its inherently dangerous nature. This is perhaps the most important lesson of youth.

PKT


[1] I need to make a slight distinction between "Eros" and "Love". In Greek, "Eros" is an urge. It possesses a playful but aggressive quality; it is a strong but unreliable force, unlike "love", which is deeper and more permanent. Eros may lead to love. The sentiment between Romeo and Juliet is eros, for example.

Tuesday, November 30, 2004

The FPU stack is your friend.

This is an issue that has always intrigued me. Many people, especially on technophile web sites like slashdot.org, seem to dislike the x87 FPU stack. Far too often I notice messages that propose an alternative register-based model, like the one used for integer calculations and logic. This seems very strange to me for two separate reasons:
  • The "stack" structure is inherently very appropriate for formula evaluation. As a matter of fact, those familiar with the Reverse Polish Notation, a (mostly) lost art that was once popular with HP scientific calculators, will quickly point out that it is the quickest way to do formula evaluation without parentheses. It is a mathematical fact that all formulas can be evaluated in a stack based model, as long as the stack has enough space. Admittedly, the stack model can be a bit cumbersome at first but this is only a matter of habit and does not make it inferior to a register model.
  • Back in the days when the FPU was an extra component (anyone remember the 387?), floating point instructions were very, very expensive. The co-processor did not have its own bus and it consumed bandwidth from the main processor (a 40 MHz FSB x 32 bits, plus overhead, for a 386DX40). Things have changed a lot since then. With the arrival of the Pentium, the first x86 processor to include a decent FPU by default (note that the 486 came in DX and SX versions, so not every 486 chip had an FPU, and the one it had wasn't that great), the relative cost of floating point math plummeted. A significant example is the transition from integer-based 3D mathematics, as seen in the Doom source code (486 era), to floating-point-based 3D mathematics, as seen in the Quake source code (Pentium era). The unlucky competitors in that period were Cyrix and AMD, whose "Pentium-class" processors would always get severely beaten in Quake's FPU-intensive benchmarks, even though they offered good integer performance. A little-known fact about the Pentium FPU is that Intel made the fxch instruction extremely fast by allowing it to execute in parallel with any non-dependent instruction. This essentially means that since the Pentium, anyone can use the FPU stack like a register set by freely executing fxch instructions. There it is. Quit whining and use the FPU however you like.
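To illustrate the first point, here is a minimal sketch of my own (not taken from any calculator firmware) showing how naturally a stack evaluates an RPN formula such as "3 4 + 2 *":

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy RPN evaluator: operands are pushed, operators pop two values and
   push the result. "3 4 + 2 *" evaluates to 14 with no parentheses. */
int main(int argc, char **argv)
{
    const char *expr = (argc > 1) ? argv[1] : "3 4 + 2 *";
    double stack[64];
    int top = 0;

    char copy[256];
    snprintf(copy, sizeof copy, "%s", expr);

    for (char *tok = strtok(copy, " "); tok; tok = strtok(NULL, " ")) {
        if (strlen(tok) == 1 && strchr("+-*/", tok[0])) {
            if (top < 2) { fprintf(stderr, "stack underflow\n"); return 1; }
            double b = stack[--top];
            double a = stack[--top];
            switch (tok[0]) {
                case '+': stack[top++] = a + b; break;
                case '-': stack[top++] = a - b; break;
                case '*': stack[top++] = a * b; break;
                case '/': stack[top++] = a / b; break;
            }
        } else {
            if (top == 64) { fprintf(stderr, "stack overflow\n"); return 1; }
            stack[top++] = atof(tok);   /* an operand: just push it */
        }
    }

    if (top == 1)
        printf("%s = %g\n", expr, stack[0]);
    return 0;
}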
PKT

P.S. Note that new RPN HP calculators are still being made. These are great tools, but most people think that it probably makes more sense to buy a Palm and throw calculator software on top of it. It's not the same thing, though, for several reasons.

Monday, November 22, 2004

Piano secrets I

This title is a bit misleading. I do not intend to share the ultimate secret of piano playing, because no such secret exists. However, I do know some minor facts regarding the physical act of playing the piano that may be of interest to some of you.

Touch, strength, sound volume and quality

You may wonder how some famous pianists achieve a wonderfully clean, majestic sound even at loud volumes, without a trace of harshness. This is very mysterious at first, because it seems that the only variable that determines the impact of the hammer on the string is the speed with which a key is pressed. How can the quality of the sound depend on a single variable that, obviously, mostly influences sound volume? The answer lies in the fact that the keyboard is not a "mathematical model" but a real, solid mechanical system, and a rapid acceleration can induce a temporary elastic deformation of the lever that bears the hammer. Therefore the hammer-lever-axis system carries energy in two ways: as kinetic energy of the hammer's movement and as elastic energy of the lever's deformation. The elastic energy of the system induces an oscillation of the hammer relative to the axis (think of a ruler or a long strip of metal being bent and then suddenly released). Now, the tricky part is that this oscillation is not related to the native frequency of the string, so upon impact it causes the string to oscillate at its native frequency (a clean sound) plus the oscillation frequency of the hammer mechanism (not harmonic, a "dirty" sound).
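If you want a crude formula for this picture (a toy model of my own, not a real piano-action equation), the energy that reaches the string is roughly E ≈ ½·m·v² + ½·k·x², where the first term is the kinetic energy of the hammer (mass m, final velocity v) and the second is the elastic energy stored in the bent lever (stiffness k, deflection x). The elastic part is released as an extra vibration of the hammer at roughly its own resonant frequency, on the order of √(k/m), which has nothing to do with the pitch of the string; that is the non-harmonic component described above.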

In practical terms this means that if the key is accelerated abruptly, the hammer-lever system is deformed and the sound is "aggressive" and "hard" because of the extra transient, non-native frequencies of the string upon impact. If the key is accelerated smoothly, but to a similar final velocity, the sound is equally loud but smoother. This is very intuitive, but hard to achieve in practice: if you hit a key with a rigid hand you force the hammer to accelerate suddenly. A tense hand produces an aggressive sound; a relaxed hand produces a smooth one. Doing this well requires tremendous amounts of practice, because great playing speed (the succession of keypresses) and great energy transfer (the speed of the hammer on the string) are very hard to achieve with a relaxed hand.

The piano itself can aid or hinder the pianist in his effort to produce a clean sound. High-end pianos like a Steinway or a Bechstein have a comfortable, mellow sound without using a very heavy action. Still, on a high-end piano the difference between an amateur and a talented professional is even more pronounced. This is good, because it clearly reveals bad piano technique. Most people have trouble realising that their sound is "harsh" and "stressful" when they play on low-quality pianos.

Medical knowledge tip: relaxing a muscle requires energy consumption. At the cellular level this happens because, when a muscle contracts, some proteins move to a "tense" state which can only be reverted with further energy consumption, although at a much lower cost than contracting the muscle. A tired muscle is unable to relax quickly, and this is easily seen in electromyography. Conclusion: when you are tense you get tired easily, and when you get tired, you get even more tense. This applies especially to very quick successive muscle contractions, like the ones in piano playing. Try to relax.

PKT