Mike Schaeffer's Weblog
Fri, 27 Jan 2006
WMI and temperature probes...
I've spent a little more time spelunking around Win32's support for power and thermal management hardware. It seems like it should be possible to use Windows API calls to determine the presence of hardware temperature sensors and sample their current readings. As it turns out, with Windows Management Instrumentation (WMI), half of this is possible.

Quoting MSDN, "Windows Management Instrumentation (WMI) is a component of the Windows operating system that provides management information and control in an enterprise environment. Administrators can use WMI to query and set information on desktop systems, applications, networks, and other enterprise components. Developers can use WMI to create event monitoring applications that alert users when important incidents occur." Effectively, what that means is that there's a collection of COM objects that allow you to discover the hardware and software configuration of your local computer. With DCOM, it's possible to use this over the network to discover the same stuff on a remote machine. I'm guessing the intent is that the administrator of a server farm can use WMI services to aggregate statistics on her charges.

Reading the WMI documentation, one of the classes of information WMI makes available is Win32_TemperatureProbe, which "represents the properties of a temperature sensor (electronic thermometer)." Had I read further, I would have also seen this and saved myself some time: "current implementations of WMI do not populate the CurrentReading property". But that's beside the point: this road gets more interesting before hitting that particular dead end. Some research on WMI and scripting led to a nice tutorial on WMI at the 4 Guys From Rolla website. From that, it was pretty easy to piece together this little script that dumps data from arbitrary WMI classes:

wscript.echo "Temperature, version 0.1"

sub ShowServices(vClass)
  'Declare our needed variables...
  Dim objLocator, objService, objWEBMCol
  Dim objWEBM, objProp, propItem, objItem, str

  Set objLocator = _
     CreateObject("WbemScripting.SWbemLocator")
  Set objService = _
     objLocator.ConnectServer() ' Connect to local PC

  Set objWEBM = objService.Get(vClass)
  Set objWEBMCol = objWEBM.Instances_
  Set objProp = objWEBM.Properties_

  For Each propItem In objProp
    str = propItem.Name

    For Each objItem In objWEBMCol
      str = str & ", " & Eval("objItem." & propItem.Name)
    Next

    wscript.echo str
  Next
end sub

ShowServices "Win32_TemperatureProbe"
Dump that script into a .vbs file, run it with cscript, and it'll write out the state of the objects of the specified class. Since Windows doesn't report temperature readings, Win32_TemperatureProbe isn't all that useful, but you ought to try it with something like Win32_Process or Win32_NetworkAdapter.

reddit this! Digg Me!

[/tech/win32] permanent link

Reading Cyrille Berger's blog has been quite interesting lately. He's been slowly plugging away at adding features to Krita, KDE's graphics editor. Krita now has 16-bit color, HDR images, CMYK color space, as well as LAB color. This is stuff that Gimp won't do until GEGL is ready and integrated. Maybe it's time to start dual booting Linux again...

reddit this! Digg Me!

[/tech/general] permanent link

Wed, 25 Jan 2006
Intel's 45 nanometer process...
Intel has released pictures of test chips made with its new 45 nanometer process. For those of you keeping score at home, that means it has transistors 4,000-5,000 times smaller (by area) than those on the original 8088. Look at it another way: the 30,000 transistors used in that old chip could now be made to fit in the space of about six of the transistors actually used.

45 nanometer is apparently the second generation of immersion lithography, which "has its roots in the proven technology of immersion microscopy". My grandfather, a microbiologist, used oil immersion lenses on his optical microscope to step the magnification up to 2,000-3,000x.

reddit this! Digg Me!

[/tech/general] permanent link

Mon, 23 Jan 2006
Windows.h is weird...
I've recently spent some time experimenting with the CallNtPowerInformation Win32 API call. If you're not familiar with it, it's a Windows NT-specific call that provides access to the power management features of the OS. Among other things, it allows the current CPU frequency and battery charge to be retrieved. Like many other Win32 APIs, CallNtPowerInformation has a very general prototype (notice the two PVOIDs for input and output):

NTSTATUS CallNtPowerInformation(
  POWER_INFORMATION_LEVEL InformationLevel,
  PVOID lpInputBuffer,
  ULONG nInputBufferSize,
  PVOID lpOutputBuffer,
  ULONG nOutputBufferSize
);
When you call CallNtPowerInformation, the InformationLevel argument specifies one of a number of possible function codes. Some of these update power management settings, some retrieve current settings, and some retrieve system status values. Based on the function code, you provide input and output arguments via standard structures passed through lpInputBuffer and lpOutputBuffer.

Where things might start to get odd is when you try to use the ProcessorInformation information level. This information level requires an output buffer of type PROCESSOR_POWER_INFORMATION. However, quoting from the MSDN documentation: "Note that this structure definition was accidentally omitted from Winnt.h. This error will be corrected in the future. In the meantime, to compile your application, include the structure definition contained in this topic in your source code." Peachy.

Being the diligent programmer I know you are, you will, of course, want to check the return value when you call this function. Believe it or not, things are still weird. To get the definition of the NTSTATUS typedef, you need to include winternl.h. To get the complete set of return codes, you need to include ntstatus.h. However, if you include both ntstatus.h and windows.h, you get warnings about duplicate preprocessor definitions, because some of these constants are defined in both header files. To solve this little problem, you need to define WIN32_NO_STATUS before including windows.h and undefine it before including ntstatus.h. This tells windows.h not to define return codes and re-enables the return code definitions in ntstatus.h.

The next problem you're likely to face is that your program fails to link. This is because powrprof.h does not explicitly specify C function linkage. If you include the header file unadorned in a C++ program, the compiler will assume C++ linkage and try to call the API with a mangled name. That doesn't work, so you're forced to explicitly specify C linkage for the include file. The net result of all these complications might well end up looking like so:

#define WIN32_NO_STATUS
#include <windows.h>
#undef WIN32_NO_STATUS

#include <ntstatus.h>
#include <winnt.h>

extern "C" {
#include <powrprof.h>
}

#include <winternl.h>
I'm not honestly sure why this had to be quite this complicated...

reddit this! Digg Me!

[/tech/win32] permanent link

Fri, 20 Jan 2006
Memory bandwidth and modern processors...
I saw this over on the STREAM website.

It's yet another perspective on the importance of caching to software performance.

reddit this! Digg Me!

[/tech/general] permanent link

I hope I never get this cynical about our industry...
"Look, the tech industry is and always will be fucked up. They still somehow manage to make a semi-usable product every once in a while. My Mac is slow as a dog, even though it has two CPUs and cost $5000, but I use it anyway because it's prettier and slightly more fun than the crap Microsoft and Dell ship. But give me a reason to switch, even a small one, and I'm outta here." - Dave Winer

If you don't know who Dave Winer is, he pioneered the development of outlining software back in the 80's, developed one of the first system scripting tools for the Macintosh, invented XML-RPC, and invented the RSS specification you're probably using to read this blog post. I'm not trying to belittle this guy's point of view, but he's been responsible for developing several major pieces of consumer application software, designed a couple of hugely significant internet protocols, and made some significant money in the process. Most people should be so lucky.

reddit this! Digg Me!

[/tech/general] permanent link

Fri, 13 Jan 2006
The End of an Era...
Nikon has just announced its intent to discontinue production of most of its non-digital cameras. In a digital world, this isn't too surprising, but for someone who grew up with Nikon film cameras, it's a bittersweet thing.

One notable thing about the announcement is that it also includes lenses for large format cameras and enlargers. These are both low volume but highly significant markets. Large format cameras are view cameras, cameras that offer incredible control over perspective as well as the ability to use very large film for very high resolution. A large format camera might have 80 times the negative size of a 35mm SLR: in digital terms that is roughly equivalent to 80 times the number of pixels. Large format cameras are what Ansel Adams used and represent the very highest performance film cameras. If there was one area where film was likely to offer unique value over digital, this was it.

reddit this! Digg Me!

[/tech/photo] permanent link

Tue, 10 Jan 2006
Pentium Chronicles, Part 2
I finished Pentium Chronicles on the train the other day. Given that the full title of the book is Pentium Chronicles: The People, Passion, and Politics Behind Intel's Landmark Chips, I have to say that I expected something entirely different from what I got. What I had thought would be a narrative discussion of Intel's development of the P6 core is really something else entirely: a book on large scale project management techniques, using a few specific examples from the P6 project. While there's nothing wrong with that kind of book (it's basically what Fred Brooks did with The Mythical Man Month), Dr. Colwell seems to have been tentative in deciding what kind of book to write.

Early on in the preface, he basically announces this tentativeness when he explicitly states that he won't be offering many of his opinions because doing so stretches his "plausible deniability" safety net too far. To me, that's emblematic of the biggest shortcoming of the book as an engineering/project management reference. Engineers working with their own plausible deniability in mind don't produce good results: they work to redirect blame rather than improve the product they're working on. Dr. Colwell knows this; he even wrote about it in the book. With all that in mind, I can't help but wonder what the book would have been like if it had been written with less concern for plausible deniability.

For people reading this blog who are wondering if they should actually read the book, my answer is yes. However, it's important to go into the book with the right expectations. If you go in expecting something like Soul of a New Machine, you'll be disappointed.

reddit this! Digg Me!

[/tech/general] permanent link

Fri, 06 Jan 2006
TiVo Series 2 and Wifi
My wife and I recently bought a TiVo Series 2 for ourselves, as well as a second one as a gift for my parents. Both are set up to use WiFi as their connection to the TiVo home office. Both were a pain in the !#$#@ to set up for WiFi.

As you might expect, WiFi on the TiVo is a huge boon: not only can the TiVo download scheduling information without being connected to the phone line, it can also communicate with PCs on your local network. TiVo provides a program that runs on your PC, TiVo Desktop, that shares MP3 files and pictures with the TiVo box itself. With that setup, you can play MP3s over your TV (or stereo) and browse digital pictures using your TiVo remote control. It's a wonderful, wonderful feature.

The dark cloud around this silver lining is that the design of the TiVo box makes it difficult to find a WiFi adapter that actually works with it. There is no Ethernet port on the back of the TiVo, so you have to use a USB WiFi adapter to connect it to the network. Maybe it's the fact that the TiVo runs Linux, but for whatever reason it is very, very picky about which WiFi adapters work and which don't. Fortunately, they provide a list of supported adapters: read it (all of it), live it, love it. TiVo has also started selling their own adapter, which might be the simplest way to get started. It's not even all that expensive ($50).

The other thing to be aware of is that the TiVo boxes currently shipping (e.g. both of the ones we bought in the last few months) are running TiVo OS version 5.x, and the WiFi adapters we used weren't supported until version 7.2.1. I don't know why they're shipping TiVos with an OS that's two major revisions out of date, but there it is: you need to update your brand new TiVo to get current WiFi support. To get the update, you need to have your TiVo wired into the phone line as part of the initial startup. TiVo will download the OS update when it connects to the home office (you can explicitly ask it to connect, which seems to work for triggering the update). Once you have the updated firmware, you can set up the networking, axe the phone line, and bring your TiVo's connectivity out of the early 90's.

Once set up, WiFi seems to work very reliably: we haven't had any trouble. The only real remaining issue we're working through is that my wife uses a VPN to log into her work on the same PC we're using to run the TiVo Desktop server. (Can you see where this is going?) Of course, when the VPN is up, it keeps the TiVo from seeing TiVo Desktop and accessing our MP3 files. There are a couple of approaches to solving this, but we haven't done anything about it yet.

One more thing: the TiVo Series 3, announced today, has 10/100Base-T Ethernet on the back panel. Now there's a good reason not to pick TiVo's $300 "lifetime of the box" service plan.

reddit this! Digg Me!

[/tech/tips] permanent link

Thu, 05 Jan 2006
Hook 'Em Horns...
My alma mater, The University of Texas at Austin, won the national NCAA football championship last night. After 35 years, balance has been restored to the world...

I just wish I was in Austin tonight, the Tower will be in full regalia. Barring that, a live Tower cam will have to do.

reddit this! Digg Me!

[/personal] permanent link

Pentium Chronicles
I was roaming through the computer section of the University of Pennsylvania bookstore and ran across Pentium Chronicles, a 2006 book about the experience of designing the P6 processor core used in the Pentium Pro, II, III, and Centrino. The author, Robert P. Colwell, was basically made employee number one on the P6 program when he was hired into Intel and given the assignment to "double the performance of the P5 on the same process." Of course, now, 15 years after that fateful assignment, it's pretty clear how influential the design produced by that program has been: it gave Intel a presence in the server and workstation markets, and it's still overshadowing its immediate successor, the Pentium 4. Even if the project hadn't been that successful, the first 20% of Dr. Colwell's book has me convinced that it'd have been an interesting read anyway.

At the opposite end of the spectrum is Kerry Nietz's book, FoxTales. As much as Pentium Chronicles was the view from the top, the perspective of a very senior architect at Intel on a huge, industry-wide project, FoxTales is the opposite: the perspective of a fresh-out-of-school programmer working on his first niche-market, shrink-wrapped software package. If anything, that means it's much more likely to be relevant to people with the time to read this blog: it certainly brought back memories of the first years of my career.

The best thing about these books is that they're both cheap and short. You can probably read the pair for under $50 and 10-20 hours of time, all of which would be well spent.

reddit this! Digg Me!

[/tech/general] permanent link

Wed, 04 Jan 2006
Sleepers, Awake.
Now that the new year has begun, the holiday season lull in blog posts has started to abate. I had wondered if posts would go down in frequency or up, since people had more time to dedicate to 'extracurricular' activities. Now we (I?) know.

The next interesting question is that since blog posts seem like they might be inversely correlated with vacation time, is blogging actually actively sponsored by employers? That is, are there firms where a high profile as a blogger will lead to things like good performance reviews, raises, and good assignments?

Even at the most open minded, technically progressive firms I've worked at, I am 100% positive that blogging activities would have been considered a distant priority behind shipping and supporting product. Maybe it's old fashioned of me to say it, but that seems like the correct approach.

reddit this! Digg Me!

[/tech/general] permanent link

Tue, 03 Jan 2006
vCalc, what's next?
Despite outward appearances, there is (really!) another release of vCalc in the works. I'm not going to be so silly as to speak to a timeline (probably 2006Q2), but here's a brief list of planned features for the next release or two:
  • Constant Library - A library of a few hundred constants.
  • Interface improvements - The current UI is functional but rather plain, both in appearance and in the interactivity it supports. The next version of vCalc will dress up the UI a bit and start the process of making it more interactive.
  • Macro Recorder - To aid programming, there's a macro recorder that records sequences of commands as programs written in the user programming language described below.
  • New Data Types - There are more first-class data types, including complex numbers, lists, tagged numbers, and programs.
  • User Programability - There's a user programming language including conditional branches, loops, and higher order functions. This language looks a lot like a lexically scoped variant of RPL, the language used by HP in its more modern calculators.
  • Better interoperability with other data sources - This means import/export of CSV, through both the clipboard and by file.
  • Financial Math - This is mainly planned to be Time Value of Money. There's actually an interface in vCalc 1.0 to support this functionality, but I released vCalc before getting it to work reliably and disabled the code that implements it. This is going to be an ongoing area for development.
  • Infix notation - There needs to be a way to enter an expression like 'sin(x)'. This is both a programmability feature and the core of things like symbolic algebra and calculus.
  • Graphics - Function plotting.
In a more general sense, there are a few other issues that are important, but have a slightly lower priority level. These are general issues that are too big to be 'fixed' in one release, but nonetheless are important areas for work. The first of these is performance and the second is openness.

Performance is the easier of the two issues to describe: I want vCalc to be usable for interactively performing simple analyses (mean, max, min, linear regression, histogram, etc.) of datasets with 100K-1,000K observations of 10-20 variables each. The worst case scenario means that vCalc needs to be able to manage an in-memory image around 500-600MB in size and compute 20-30M floating point operations within 5-10 seconds. That's a stretch for vCalc, but I think it's doable within a year or two. Right now, development copies of vCalc can reasonably manage 100K observations of 50 variables each. The biggest weakness is the CSV file importer, which is glacially slow: it reads CSV files at around 30K/second. I'll speak to these issues in more detail later on, but the fix will be a staged rewrite of the Lisp engine and garbage collector at the core of vCalc.

The other issue that will have to be fixed over time is the issue of openness. One of the things I'd like this blog to be is a way to communicate with the audience of vCalc users. That means example code, demonstrations of how to use vCalc to solve specific problems, and descriptions of the guts of vCalc, at the very least. For that to be useful, there needs to be an audience, and for there to be an audience the vCalc bits need to be available for people to use and good enough for them to care about using. There's a lot to be done between here and there, but I've come to believe that open sourcing parts of vCalc and releasing more frequent development builds of the closed source parts will end up being key. We'll see.

reddit this! Digg Me!

[/tech/ectworks/vcalc] permanent link

Happy New Year, 2006!
It's hard to believe that it's been six years since people were buying out stocks of portable generators and predicting the end of the civilized world. Still, there it is. One of the more interesting theories I've heard about the long term impacts of the Y2K scare is presented by Thomas Friedman in his book The World is Flat. In it, he describes Y2K as one of the ways Indian software houses first established themselves as a credible way to develop software. If that's true, then maybe there's an element of truth to what the Y2K 'doomsayers' were claiming six years ago. However, rather than the end of the civilized world, Y2K might have just signaled the beginning of the decline of pure software development as a viable American middle class career.

On a lighter note, every new year needs resolutions, and here are those of mine that are appropriate to this weblog.
  • More posts. Better Content.
  • The blog needs a few more features. Namely, it needs a way to look at historical posts, as well as a way to post comments or send feedback.
  • There will be a release of a new version of vCalc this year.
  • There will be a release of a new version of NoiseMaker this year.

reddit this! Digg Me!

[] permanent link