Will-Harris House

Are you using the software that's best for you? Are you using your software as well as it can be used? Those two questions have been nagging at me to the point where I couldn't stop my fingers from finding a keyboard and pounding out a few hundred words on the subject.

If you're not using the best software for you or your particular job, it may not be your fault: many product reviews suck. They don't tell you what you need to know, and they often leave you feeling as if you've just read the back of the package, not an informed review.

Bad Reviews

The most common threads that bind bad reviews are these: the reviewer doesn't know how to use the program; the reviewer didn't actually use the program; the reviewer knows and likes another program so much that they'd never like this one, no matter how much better it might be; or the reviewer has never done the task the program is meant for (be it DTP or accounting), so they're not qualified to judge how well the program performs it.

Steve (Fred) Roth, great editor and fellow columnist that he is, once told me that bad editors think that if the author knows nothing, he or she will be better equipped to explain a product to other people who know nothing. In the end, all you really get is a review that says nothing.

Of course, the author isn't always to blame; sometimes they're working with editors who don't know a mouse from a rat, or "fact-checkers" who believe an uninformed PR person instead of an informed reviewer.

You know it's a bad review if it just lists features. Quite frankly, it doesn't matter how many features a program has if they're hard to use, slow, buggy, or badly designed. Most reviews that dwell on features are written by people who never loaded the software on their computers and wrote the review from press releases and the bulleted list printed on the box.

You know it's a bad review if, after reading it, you don't know what the product's strong and weak points are. Do you know what it's good for and what it's bad at? Do you know when you would want to use it, and what you'd want to use it for? These are absolutely basic questions that many reviews simply fail to answer. The best reviews leave you feeling as if you've had a personal demonstration of the product.

You know it's a bad review if the author confuses "ease of learning" (which reviewers love) with "ease of use" (which many reviewers rarely get to the point of understanding). It can be difficult and time-consuming to actually learn a program, so many reviewers don't.

You know it's a bad review if it only tells you how fast a program completed a task. Speed is relative. A program may be incredibly fast at printing but incredibly slow at getting the pages to the point where you can print them. Which would you prefer? It doesn't matter if a program is lightning fast at something esoteric if it takes six keystrokes to perform a common task. But this kind of reviewing requires knowledge of more than one program and an understanding of what users actually do most often.

You know it's a bad review if it uses the same canned illustrations you've seen for the product in every other review in every other magazine. If an author hasn't taken enough time to create real documents or enter real data, they may not have the faintest idea what the program is really like. Some programs demo fabulously well; then you sit down in front of them and feel as if you're being punished.

You know it's a bad review if the product hasn't been compared to at least one other program. Programs don't exist in a vacuum; how does it stack up against the competition? Finally, you know it's a bad review if there isn't at least one negative point in a positive review, or one positive point in a negative review.

Biased Tests

But perhaps the thing that steams me most is the thing that's hardest for you to detect unless you already know the products being reviewed: bias. Let's face it, software is subjective. Some people love PageMaker and loathe Ventura. Some people love Ventura and loathe PageMaker. Other people adore Xpress and hate PageMaker, etc.

No test, no matter how scientific, finds anything other than what it's looking for. In fact, I find that the magazines that run the most pseudo-scientific tests often come up with the most flawed results. Why? Because not everything can be quantified, and because often the most important part of a program is how it works and how it feels, not just what it does or how fast it does it.

And then there are the truly unfair "lab" tests that are consciously or unconsciously biased toward one program or another. It's easy to do: just find the unique features of one program and build them into the test. Use a lot of drop caps, ruling lines, automatic reverses and bullets, and pagination controls, and you favor Ventura. Use a lot of rotation, font compression/expansion, color, and irregular text wraps, and you favor Xpress or PageMaker. Different programs are different, yet testing must be fair to all. The tests should be based on the tasks the programs were designed for, not the tasks one specific program was designed for.

If these "scientific" reviewers were writing about fruit they might say, "While the orange was very colorful and flavorful, it didn't look or taste much like an apple, so we can't recommend it." You find what you're looking for.

Face it, all programs have strengths and weaknesses, and that's what reviews should tell you. Unfortunately, many don't. When they don't, ignore them and find one that does; otherwise you won't be sure that the software they're reviewing is right for you.

Lazy Users

Now that I've alienated some of my colleagues, I'm going to press my luck and tell you what I think of you. Well, not you perhaps, but that idiot to your right.

Why don't you spend the time it takes to really learn your software? Huh? I'm consistently shocked and appalled at how little many people know about the software they use daily. Some people, of course, know every intimate detail of their software to the point of being productive, if not annoying. But many people, including a big hunk of so-called "power users" (basically just people who know more than anyone else they know), aren't making the most of their software's capabilities.

I have a personal theory that 90% of users only use 10% of their software (and perhaps only 5% of their brain). These are the people who do things manually that their software could do for them. These are the people who sit in front of their page composition program and manually swipe over text, changing type styles and fonts when they could automate the process in a word processor and import the results.

These are the people who do everything from scratch every time, when they could create reusable templates and spend their time on better things than reinventing the wheel. These are the people who would spend $20 on long-distance tech-support-hold-hell rather than spend two seconds looking in the index of their manual, or, better yet, searching for the answer in the program's on-line help. How do you people get anything accomplished?

I know software is complicated, and getting more complex every day. I know it takes time to learn a program, and I'm not saying you should know everything overnight. But I am saying that you should select every single menu item and see what it does; explore every single dialog box and see if there are things you didn't even know your program could do; press every key on your keyboard and see if there aren't shortcuts you overlooked. In short, spend an hour, once, to save an hour every day.

~~DWH


(Headline Typeface: Backspacer Round from Emigre)

