Tom Lachecki

(Tomalak Geret'kal)


HQ Shamelessness from MS

Have you seen the new "I'm A PC" commercial?

Meet Lauren. Lauren wants to find a laptop for under US$1000, with a 17" screen and a comfortable keyboard amongst other criteria.

She's promised by an anonymous voiceover man (accompanied on screen by a contemporary Windows logo) that if she can find what she's looking for, they'll buy it for her.

So off she goes; first to the "Mac store". Without even showing us the store, she comes out and states that the only laptop under $1000 inside has a 13" screen. Dejected and disappointed, she reckons she's just "not cool enough to be a Mac person".

Then we find ourselves in Best Buy, the camera dollying down rows of shiny new laptops, Lauren grinning everywhere she goes. She finds exactly what she wants: an HP Pavilion with 4GB of RAM and a 250GB hard disk. It appears to be running Vista. And it only costs $699.99! So she buys it and goes home all happy.

Now, it's a well-produced video. The actress is photogenic enough, and it's pretty decent marketing, especially when you consider Microsoft's most recent not-so-successful offerings.

So it's too bad the whole thing is utterly shameless. They don't even bother hiding the fact that the girl got paid $700, on camera, to say nice things about a computer running Windows. And even that assumes she's not simply an actress picked out of drama school and paid in cash.

"I'm a PC, and I got just what I wanted [for free]."

It's such a waste of decent production values, if you ask me. Catch the full video after the jump.

Bootnote

And yes, I realise that by posting this commentary I'm generating more awareness of the video. Frankly, though, who hasn't heard of Microsoft?

Tomalak's Tuesday Tip #4: Conventional Thinking

I'm occasionally asked why C++ programmers conventionally use .cpp and .hpp files, what they use them for and what happens if they don't. On the spot I'll usually come out with the template answer that conventions exist for a reason, but I thought I might as well take a moment to explain more fully the reasoning behind this one.

To understand the pitfalls of straying from convention, one must first grasp the complexity of compiling a C++ project. In fact, when you turn your source code into an executable, you are not merely compiling it. Many steps are involved, and I've attempted to illustrate them below.
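Roughly speaking: the preprocessor expands each .cpp file's #include directives and macros, the compiler turns each preprocessed file into an object module, and the linker joins those modules into a single executable. Take a minimal example project, with contents that are purely illustrative, followed by what a.cpp looks like once the preprocessor has run:

    // b.hpp
    void b();

    // b.cpp
    #include "b.hpp"

    void b()
    {
        // ... whatever b actually does ...
    }

    // a.cpp
    #include "b.hpp"

    int main()
    {
        std::cout << "calling b()" << std::endl;   // needs <iostream>; see the note below
        b();
        return 0;
    }

    // a.cpp, after the preprocessor has run: the #include directive has been
    // replaced with the contents of b.hpp, so the declaration sits at the top
    void b();

    int main()
    {
        std::cout << "calling b()" << std::endl;
        b();
        return 0;
    }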

(Note: I have not included iostream in the above code, as when pre-processed it would expand to many thousands of lines. You would, however, need to include it for the code to compile successfully, because of the reference to cout. Also, it doesn't really matter what the files are called; they don't even have to have the .cpp and .hpp extensions, although graphical code editors prefer them.)

Notice that a.cpp uses a function void b() that's defined in b.cpp, yet right up until linkage it has no knowledge of what that function does. The compiled contents of a.cpp have no idea what the compiled contents of b.cpp do.

This is where the linker comes in. It takes the object code of each compiled module and joins up all the function calls. It makes sure that when a.cpp asks for void b(), that function exists somewhere in one of the other modules. In this case it does: it's in the compiled b.cpp.

Function linkage

What would happen if b.cpp weren't included in the project? If it weren't compiled and subsequently linked into the finished executable?

You get a linker error, telling you that the function definition couldn't be found and the executable couldn't be created.
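(With GCC, for instance, you can see this for yourself: compile the two files separately with g++ -c a.cpp and g++ -c b.cpp, then link the resulting object files with g++ a.o b.o. Leave b.o out of that last step and the link fails with something along the lines of "undefined reference to `b()'".)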

So if void b() wasn't defined, why did compilation succeed?

Declaration vs Definition

The compilation stage succeeded because, until linkage, all the compiler needs to know is that void b() exists somewhere. It doesn't need to know what the function does; it only needs to know that the function exists and that the linker will take care of the rest at the very end of the process.

This is why our processed code has "void b();" at the top. This is called a declaration and says, "you may use this function; it's defined somewhere else". You can have as many declarations for the same function as you like ("you already know this, but you may use this function") but only one definition.
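To make that concrete with our void b() example:

    void b();      // declaration: fine
    void b();      // declaring it again: also fine

    void b() { }   // definition: allowed exactly once across the whole program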

Why is this all so complicated? Consider this case:

Both a.cpp and b.cpp want to use the same function void c(int x), which is defined in c.cpp. We could simply have two versions of the same function, but that would mean updating both copies whenever we wanted to make a simple change to it. This is a trivial example, but as projects grow and code becomes more complex, it's extremely common to find that more than one module wants access to the same function.

Defining this function in a header file and including that header in a.cpp and b.cpp would result in trouble:
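Something like this, with an illustrative header named c.hpp:

    // c.hpp -- the full definition lives in the header (this is the mistake)
    void c(int x)
    {
        // ...
    }

    // a.cpp
    #include "c.hpp"

    int main()
    {
        c(1);   // the compiled a.cpp now carries its own copy of c(int)
        return 0;
    }

    // b.cpp
    #include "c.hpp"

    void b()
    {
        c(2);   // ...and the compiled b.cpp carries another
    }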

Linker finds two matching functions in compilation scope

When linking up the call to void c(int x) with its actual definition, the linker finds that it can see multiple definitions to choose from. It can't simply ignore all but one definition, because they might differ.

But because we're allowed to declare functions as often as we like, we can simply declare void c(int x) in a header file, include that header wherever we might need it, and let the linker locate the single definition, found in c.cpp.
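With illustrative contents again, the conventional layout looks like this:

    // c.hpp -- declaration only: "this function exists somewhere"
    void c(int x);

    // c.cpp -- the one and only definition
    #include "c.hpp"

    void c(int x)
    {
        // ...
    }

    // a.cpp
    #include "c.hpp"

    int main()
    {
        c(1);   // resolved by the linker to the definition compiled from c.cpp
        return 0;
    }

    // b.cpp
    #include "c.hpp"

    void b()
    {
        c(2);   // same single definition; no clash
    }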

Conclusion

It's very difficult to explain why this convention is in many cases the best way out of dependency hell, but hopefully I've given an impression of how the C++ compilation/linkage process works and demonstrated that declaring in headers, defining in source is a very good rule of thumb to follow.

Bootnote

I can't be assed to fix this now as it's unrelated to the purpose of this post, but can anyone spot the bug in my last couple of examples?

Redmond's Font of Fail

OK, mini-gripe time.

See this old dialog box from Windows 3.1? I'm getting flashbacks now; I actually kind of miss these dodgy old interfaces. This one was for picking new fonts to install, back when you always had to do it manually.

It seems familiar, though. That's because it hasn't changed in any incarnation of Windows up to the present day. That's right! Observe:

Windows 95/98/ME/2000

Windows XP

Windows Vista

Slightly ridiculous, no? Admittedly it's not "broken", and it's not really causing much of a problem. But, given the extent to which Microsoft like to fiddle with GUIs these days, the mere fact that they decided to let this gem survive right through to 2008 could demonstrate a lack of quality control. Along with some other examples, of course.

And, if nothing else, significant variations in the look and feel of a GUI can hinder a workflow. Design rules: broken.

Bootnote

Bonus image: check out this gem from Windows XP!

Tomalak's Tuesday Tip #3: When A Macro Is Not A Macro

A question came up on IRC today regarding preprocessor macros and which ones are standard. Amazingly, this MSDN article does a good job of describing the standard macros and lists those non-standard but conventional macros implemented in Visual Studio.

What it doesn't do is mention that some of those macros in fact aren't macros at all, but implicitly defined variables. It goes so far as to mention that compiling with the /E or /EP parameters (equivalent to g++ -E, or piping the source through cpp: running the preprocessor over the code and doing nothing else) will not expand the 'macros', but it is still very misleading as to why.
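Take a file along these lines (contents illustrative, and deliberately placed at file scope, where __FUNCTION__ has no meaning):

    // probe.cpp
    const char* file = __FILE__;
    int line         = __LINE__;
    const char* func = __FUNCTION__;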

This is not valid code, so compilation would fail, but after running the preprocessor over it you get the following output:
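Roughly the following, at least with g++ -E (comments stripped and linemarkers trimmed; the 3 is simply the line that __LINE__ sat on):

    const char* file = "probe.cpp";
    int line         = 3;
    const char* func = __FUNCTION__;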

This demonstrates that __FUNCTION__ isn't actually a preprocessor macro, but an implicitly defined variable (in fact, a const char* though this example doesn't show it). __FILE__ and __LINE__ are macros and as such are evaluated by the preprocessor.
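Used where it is valid, inside a function body, it behaves like the implicitly defined variable described above; a quick sketch:

    #include <iostream>

    void greet()
    {
        std::cout << __FUNCTION__ << std::endl;   // prints "greet"
    }

    int main()
    {
        greet();
        return 0;
    }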
