Monday, January 02, 2006
Software Transactional Memory, Part I: Conception
It all started with the Collections API, but then, so many things have. The classic performance tuning target is algorithm misuse. When I was a young programmer, I saw a lot of software that was genuinely suffering from this problem. Frequently, the cause was selection of the wrong collection type or the use of poor tuning parameters. At the time I had two favorites.

Let's start with the simplest one: basic algorithm selection. I'd find code that was doing collision detection (contains()) against a large Vector. Sometimes I'd find that the list wasn't even ordered, so modifying it to use a Hashtable was a trivial change (a rough sketch appears below).

The other kind generally took longer to arise, appearing in more mature code. I'd see a program that was using too much memory, or taking too long to initialize a large object graph, or both. Upon profiling the code, I'd discover that the culprit was code like this:

    Vector children = new Vector(6);

I'd see that children typically contained 7 or 8 elements. That meant that for every node in the graph, cycles were wasted resizing the collection and memory was wasted on empty slots. The irony of code like this was that the original author would have been better off doing nothing at all: the default capacity of 10 would have avoided the resize operation and wasted less memory than the doubled capacity of 12.
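Here is a rough sketch of the first kind of fix, the contains()-against-a-large-Vector case described above. None of this code is from the original post; the class name, element values, and sizes are made up for illustration. The point is just that Vector.contains() is a linear scan on every call, while a Hashtable keyed on the same values (the stored value is a throwaway) answers containsKey() in roughly constant time.

    import java.util.Hashtable;
    import java.util.Vector;

    public class LookupSketch {
        public static void main(String[] args) {
            int n = 100000;

            // The original shape: membership tests via contains(),
            // which scans the Vector linearly on every call.
            Vector<String> vector = new Vector<String>();
            for (int i = 0; i < n; i++) {
                vector.add("node-" + i);
            }
            boolean slow = vector.contains("node-" + (n - 1));

            // The trivial change: key a Hashtable on the same values
            // (the stored value is irrelevant) and use containsKey().
            Hashtable<String, Boolean> table = new Hashtable<String, Boolean>();
            for (int i = 0; i < n; i++) {
                table.put("node-" + i, Boolean.TRUE);
            }
            boolean fast = table.containsKey("node-" + (n - 1));

            System.out.println(slow + " " + fast);
        }
    }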
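And a small sketch of the sizing arithmetic in the second story, again not from the original post. The capacity() calls just make visible what Vector does internally: constructed with a capacity of 6 and no capacityIncrement, it doubles to 12 the moment the seventh element arrives, while the no-arg constructor's default capacity of 10 holds 7 or 8 elements without ever growing.

    import java.util.Vector;

    public class SizingSketch {
        public static void main(String[] args) {
            // Sized for 6 children, but typically holding 7 or 8.
            Vector<Object> undersized = new Vector<Object>(6);
            for (int i = 0; i < 8; i++) {
                undersized.add(new Object());
            }
            // The seventh add triggered a copy and a doubling to 12,
            // leaving 4 or 5 empty slots on every such node.
            System.out.println("new Vector(6) capacity: " + undersized.capacity()); // 12

            // Doing nothing at all: the default capacity of 10 holds
            // the same 8 elements with no resize and only 2 empty slots.
            Vector<Object> defaulted = new Vector<Object>();
            for (int i = 0; i < 8; i++) {
                defaulted.add(new Object());
            }
            System.out.println("new Vector() capacity: " + defaulted.capacity()); // 10
        }
    }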