literally unprecedented

Our man has been trying the recent Windows 7 beta (build 7000), and it appears that it is in fact faster than Windows XP.

No, really. Part of me would love for this to be untrue, and I would be completely unsurprised if it were. But I find myself relishing this prospect even more: preliminary indications show that Microsoft actually knows what they’re doing.

I’m trying to keep this conservative, but after the vastly disappointing debacle that was Windows Vista, the concept of a BETTER version of Windows after six or seven years is mind-bending. And very exciting. As inviting as the prospect of the gaming world migrating to Linux might be, it’s a GREAT relief to find that Windows 7 wasn’t completely hype after all.

For many moons I have been a stolid unbeliever in Microsoft’s ability and willingness to change their dastardly ways and make an operating system that does what it should (i.e., operate the system, not commandeer it). This all started when they made Vista and subsequently failed to improve it significantly, and the Games for Windows initiative didn’t help either. (It seemed too evil, and furthermore it was pushing games on Windows Vista, the last thing I would want to run games on. Well, OK, not the last.)

Their continuing advertisements, as vital as they may have been to the company’s product line, really ruined their credibility with me and many other technologically adept people, since they doggedly continued to insist that Vista was just fine, that nothing was wrong with it, and that people weren’t buying it for reasons that no longer applied. The fixes they applied to the OS were still sadly inadequate, however, and the Mojave Experiment only showed how fancy and easy to use their shiny new interface was without revealing anything about its performance problems. We already knew Vista looks pretty nice to use; that doesn’t change the fact that it’s simply a resource hog that chops the very heart out of your high-performance applications and messily eats it, spilling all over whatever parts of your RAM it isn’t already using.

With all this going on, I found it nearly impossible to believe that they were actually making improvements over Windows Vista as they claimed. When it was announced that they weren’t writing it from the ground up but merely modifying the Vista base, I essentially lost hope. Early alphas were also essentially Vista, only slower, uglier, and fraught with worrying compatibility issues.

So, this comes to me personally as a great and welcome surprise. I will now assume a stance of cautious optimism in the face of the future and pray that the great M$ does not go on to botch and murder what could be their greatest achievement in a decade.

Connway will probably be putting up some numbers from his testing, playing various games and whatnot in Windows 7 and XP. It’s looking quite promising; it really is.

more recent graphics

Enlighten, a newish graphics engine developed by Geomerics, has actually achieved real-time radiosity effects. It simulates light scattering off of lit surfaces and illuminating other surfaces that are not directly in the light, which then light still other surfaces… a process which Geomerics claims continues effectively ad infinitum with their engine.
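To give a rough sense of what “effectively ad infinitum” means, here’s a toy sketch of the classic radiosity iteration: each pass adds one more bounce, and the corrections keep shrinking until they stop mattering. This is the textbook formulation, not Geomerics’ actual algorithm, and every number in it is made up for illustration.

```python
# Toy radiosity solver: repeatedly bounce light between surface patches
# until the solution converges. This is the textbook Jacobi-style
# iteration, NOT Geomerics' actual algorithm -- just an illustration of
# why the bouncing "continues ad infinitum" yet still settles down.

# Three patches: emitted light, reflectivity, and form factors
# (form_factors[i][j] = fraction of light leaving j that reaches i).
# All values here are invented for the example.
emission = [1.0, 0.0, 0.0]          # patch 0 is the light source
reflectivity = [0.0, 0.5, 0.7]
form_factors = [
    [0.0, 0.2, 0.3],
    [0.2, 0.0, 0.4],
    [0.3, 0.4, 0.0],
]

radiosity = emission[:]             # zeroth bounce: direct emission only
for bounce in range(50):            # each pass adds one more bounce
    new = [
        emission[i] + reflectivity[i] *
        sum(form_factors[i][j] * radiosity[j] for j in range(3))
        for i in range(3)
    ]
    if max(abs(new[i] - radiosity[i]) for i in range(3)) < 1e-9:
        break                       # further bounces no longer matter
    radiosity = new

print(radiosity)  # converged light leaving each patch
```

The impressive part, of course, is doing anything of this flavor fast enough to run every frame.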

It’s quite impressive, and I recommend that you give it a look on their media page.

In other somewhat-related news, VLC Media Player’s newest version finally has good-looking subtitles. For anyone still using 0.8.6, now is a perfectly good time to upgrade to the newest version (0.9.8a at the time of this writing).

It’s been over three years since I first used VLC, and the whole time I’ve wished it had more powerful subtitle support. Now it features a pretty powerful engine with proper layout, colors, and fonts as dictated by some of the most recent subtitle formats. I thought this day would never come.

In fact, I had just installed the newest version and was chatting with a friend about this very feature, wondering if and when it would ever be implemented. I was somewhat doubtful… and then BAM, they’d actually done it. Pretty crazy.

danger: technical stuff about the future!

An article posted only a couple of hours ago on Ars Technica regarding memory bandwidth vs. many-core computing made me raise my eyebrows. The gist of the article is that, as computers are built now, adding more cores to the processor will not continue to be beneficial, since there is a limit to how fast information can be read from or written to RAM. As you add more CPU cores, each one gets less and less memory bandwidth; lately, the number of CPU cores has been increasing faster than memory bandwidth.
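The arithmetic behind the worry is simple division. Here’s a back-of-the-envelope sketch; the 25.6 GB/s figure is in the ballpark of a current triple-channel DDR3 setup, but treat all the numbers as illustrative rather than benchmarks.

```python
# Back-of-the-envelope illustration of the article's point: total memory
# bandwidth is shared, so each extra core gets a smaller slice of it.
total_bandwidth_gbs = 25.6  # roughly triple-channel DDR3; illustrative only
for cores in (1, 2, 4, 8, 16, 32, 64, 128):
    per_core = total_bandwidth_gbs / cores
    print(f"{cores:3d} cores -> {per_core:6.2f} GB/s per core")
```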

So, the article raises a valid point: the number of cores per socket is effectively limited in a practical sense by the total rate of memory access. I believe the problem with this argument—and the reason that we will be seeing vastly more than 16 cores in the future—is that you can always just split everything up.

You can put fast, compact DRAM right there on the processor die as another level of cache (maybe 128MB of memory per core? That’s not too hard). You can have multiple sockets: as massively multicore becomes the order of the day, more powerful computers will have appropriately more CPU sockets, and the space and importance assigned to card slots may well lessen too.

And giving each processor core its own individual memory space isn’t such a bad idea either. Say you have 128 processor cores, each with maybe 128 or 512MB of dedicated, independent memory (perhaps on-chip); you can then have another 32GB or so of core memory shared between all of them, and everything works out great. You can assign large chunks of work to each core, and the total memory bandwidth is off the charts.
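You can already act out this idea in software today. Here’s a minimal sketch, assuming a made-up workload that splits cleanly: each worker chews on its own private chunk of the data (the “dedicated memory”), and the only shared traffic is the tiny result each one hands back (the “core memory”).

```python
# Sketch of the "give each core its own memory" idea in software terms:
# split the data so each worker touches only its private chunk, and
# share only the small per-chunk results at the end. The workload and
# all sizes here are hypothetical.
from multiprocessing import Pool

def crunch(chunk):
    # Works entirely out of its "local" chunk; no shared traffic
    # until the single result is returned.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 4
    step = len(data) // n_workers  # assumes the data divides evenly
    chunks = [data[i * step:(i + 1) * step] for i in range(n_workers)]

    with Pool(n_workers) as pool:
        partials = pool.map(crunch, chunks)  # each core on its own chunk

    print(sum(partials))  # the only "shared memory" step
```

The point being: the less the cores have to talk through shared memory, the less the shared bandwidth ceiling matters.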

There should also be a way for data read from the core memory to be loaded simultaneously into an arbitrary set of CPU core caches, to make synchronizing data sets between cores easy as pie.

The point I’m making is that the article isn’t wrong in what it says, but I believe it has an improper focus. It shouldn’t dwell on naysaying, pointing out all the reasons why technology can’t move on. (Don’t be silly; technology can always move on.) It should focus on what needs to change about current computing architecture to adapt in the future.

edit: Basically, don’t freak out because current computing architecture has limitations. There is always some major aspect of the way computers are built that has to change next. If we’d done it perfectly the first time, we would all be demi-gods ascended above the petty material plane by now.

tl;dr – Don’t say that computers are limited just because there are bottlenecks. Just make a whole ton of little computers and put them in the same box, all wired together. Only, you know, with advanced technology. So that they go fast. And stuff.