Quality is dead in computing. Been dead a while, but like some tech’d up version of Weekend at Bernie’s, software purveyors are dressing up its corpse to make us believe computers can bring us joy and salvation.
You know it’s dead, too, don’t you? You long ago stopped expecting anything to just work on your desktop, right? Same here. But the rot has really set in. I feel as if my computer is crawling with maggots. And now it feels that way even when I buy a fresh new computer.
My impression is that up to about ten years ago most companies were still trying, in good faith, to put out a good product. But now many of them, especially the biggest ones, have completely given up. One sign of this is the outsourcing trend. Offshore companies, almost universally, are unwilling and unable to provide solid evidence of their expertise. But that doesn’t matter, because the managers offering them the work care for nothing but the hourly rate of the testers. The ability of the testers to test means nothing. In fact, bright inquisitive testers seem to be frowned upon as troublemakers.
This is my Quality is Dead hypothesis: a pleasing level of quality for end users has become too hard to achieve while demand for it has simultaneously evaporated and penalties for not achieving it are weak. The entropy caused by mind-boggling change and innovation in computing has reached a point where it is extremely expensive to use traditional development and testing methods to create reasonably good products and get a reasonable return on investment. Meanwhile, user expectations of quality have been beaten out of them. When I say quality is dead, I don’t mean that it’s dying, or that it’s under threat. What I mean is that we have collectively– and rationally– ceased to expect that software normally works well, even under normal conditions. Furthermore, there is very little any one user can do about it.
(This explains how it is possible for Microsoft to release Vista with a straight face.)
I know of a major U.S. company that recently laid off a group of more than a dozen trained, talented, and committed testers, instead outsourcing that work to a company in India that obviously does not know how to test (judging from documents shown to me). The management of this well-known American company never talked to their testers or test managers about this (according to the test manager involved and the director above him, both of whom spoke with me). Top management can’t know what they are giving up or what they are getting. They simply want to spend less on testing. When testing becomes just a symbolic ritual, any method of testing will work, as long as it looks impressive to ignorant people and doesn’t cost too much. (Exception: sometimes charging a lot for a fake service is a way to make it seem impressive.)
Please don’t get me wrong. Saving money is not a bad thing. But there are ways to spend less on testing without eviscerating the quality of our work. There are smart ways to outsource, too. What I’m talking about is that this management team obviously didn’t care. They think they can get away with it. And they can: because quality is dead.
I’m also not saying that quality is dead because people in charge are bad people. Instead what we have are systemic incentives that led us to this sorry state, much as did the incentives that resulted in favorable conditions for cholera and plague to sweep across Europe, in centuries past, or the conditions that resulted in the Great Fire of London. It took great disasters to make them improve things.
Witness today how easily the financial managers of the world are evading their responsibility for bringing down the world economy. It’s a similar deal with computing. Weak laws pertaining to quality, coupled with mass fatalism that computers are always going to be buggy, and mass acceptance of ritualistic development and testing practices make the world an unsafe place for users.
If we use computers, or deal with people who do, we are required to adapt to failure and frustration. Our tools of “productivity” suck away our time and confidence. We huddle in little groups on the technological terrain, subject to the whims and mercies of the technically elite. This is true even for members of the technically elite– because being good in one technology does not mean you have much facility with the 5,000 other technologies out there. Each of us is a helpless user, in some respect.
Want an illustration? Just look at my desktop:
- Software installation is mysterious and fragile. Can I look at any given product on my system and determine if it is properly installed and configured? No.
- Old data and old bits of applications choke my system. I no longer know for sure what can be thrown away, or where it is. I seem to have three temp folders on my system. What is in them? Why is it there?
- My task manager is littered with mysterious processes. Going through, googling each one, and cleaning them up is a whole project in and of itself (see the sketch after this list).
- I once used the Autoruns tool to police my startup. Under Vista, this has become a nightmare. Looking at the Autoruns output is a little like walking into that famous warehouse in Indiana Jones. Which of the buzillion processes are really needed at startup?
- Mysterious pauses, flickers, and glitches are numerous and ephemeral. Investigating them saps too much time and energy.
- I see a dozen or two “Is it okay to run this process?” dialog boxes each day, but I never really know if it’s okay. How could I know? I click YES and hope for the best.
- I click “I Agree” to EULAs that I rarely read. What rights am I giving away? I have no idea. I’m not qualified to understand most of what’s in those contracts, except they generally disclaim responsibility for quality.
- Peripherals with proprietary drivers and formats don’t play well with each other.
- Upgrading to a new computer is now a task comparable with uprooting and moving to a new city.
- I’m sick of becoming a power user of each new software package. I want to use my time in other ways, so I remain in a state of ongoing confusion.
- I am at the mercy of confused computers and their servants who work for credit agencies, utility companies, and the government.
- I have to accept that my personal data will probably be stolen from one of the many companies I do business with online.
- Proliferating online activity now results in far flung and sometimes forgotten pockets of data about me, clinging like Spanish Moss on the limbs of the Web.
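To make the mystery-process item concrete, here is a minimal sketch of what that cleanup project entails, assuming Python with the third-party psutil library installed. The whitelist of “processes I can vouch for” is a hypothetical placeholder, and every process that falls outside it is a separate research errand:

```python
# Minimal sketch: enumerate running processes and flag the ones I can't
# vouch for. Assumes the third-party psutil library (pip install psutil).
# The KNOWN set is a hypothetical placeholder, not a real safe-list.
import psutil

KNOWN = {"explorer.exe", "svchost.exe", "firefox.exe"}

for proc in psutil.process_iter(["pid", "name", "exe"]):
    name = (proc.info["name"] or "").lower()
    if name not in KNOWN:
        # Each line printed here is one googling-and-judging errand.
        print(proc.info["pid"], name, proc.info["exe"])
```

On a typical desktop that loop prints dozens of lines, which is exactly the point: the triage never ends.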
Continuous, low-grade confusion and irritation, occasionally spiking to impotent rage, is the daily experience of the technically savvy knowledge worker. I shudder to think what it must be like for computerphobes.
Let me give you one of many examples of what I’m talking about.
I love my Tivo. I was a Tivo customer for three years. So why am I using the Dish Network and not Tivo? The Dish Network DVR sucks. I hate you Dish Network DVR developers! I HATE YOU! HAVEN’T YOU EVER SEEN A TIVO??? DO YOU NOT CARE ABOUT USABILITY AND RELIABILITY, OR ARE YOU TOTAL INCOMPETENT IDIOTS???
I want to use a Tivo, but I can’t use it with the Dish Network. I have to use their proprietary system. I don’t want to use the Dish Network either, but DirecTV was so difficult to deal with for customer service that I refuse to be their customer any more. The guy who installed my Dish Network DVR told me that it’s “much better than Tivo.” The next time I see him, I want to take him by the scruff of his neck and rub his nose on the screen of my Dish Network DVR as it fails once again to record what I told it to record. You know nothing of Tivos, you satellite installer guy! Do not ever criticize Tivo again!
Of all the technology I have knowingly used in the last ten years, I would say I’m most happy with the iPod, the Tivo, and the Neatworks receipt scanning system. My Blackberry has been pretty good, too. Most other things suck.
Quality is dead. What do we do about that? I have some ideas. More to come…
Antoinette says
As much as it hurts, it seems true. It did not die on its own; it did not die because its loving caregivers (us testers) didn’t tend to it lovingly and with dedication. It did not die without a fight that we took to the very end.
It was murdered. Murdered by shallow, short-sighted, ignorant, incompetent, greedy people who see only the quick win. People who don’t recognize or value the skill that testing requires. How many times have you heard “Anyone can test”? How many times have they said “Why is it taking so long?” or rejected a tester’s input regarding design and usability? So few really understand that it is not our role solely to find bugs or to ensure that vague and incomplete requirements are instituted. We are the voice of the software… and we are not permitted to speak.
Your experience is shared by many. It should resonate with every user. But, instead of giving up, everyone should use these experiences to fight back. The value of the contribution by skilled quality and test people can be seen in every phase of development, but the true value is seen in the end product. Use the shared experiences of these bad examples to communicate to ‘them’ the importance, the value and the necessity of a concerted dedication to quality.
What do we do about it? Never surrender…
Michael M. Butler says
I won’t buy an iPod because I was unable on three separate occasions to get iTunes to work properly on my AMD-processor Windows XP PC. Either playback audio was choppy, or other mysterious things happened to my files. I have no way of knowing why it failed, but I know what’s a waste of my time. Count yourself fortunate that iTunes works for you, as I presume it does.
Ted Nelson coined the term “Cybercrud”, which from the roots relates to steering, or being steered, into crud. As more and more technical underpinnings proliferate, fewer and fewer things are immune from it.
None of this makes me feel warm and fuzzy when it comes to cloud computing, no matter how much the Motley Fool folks and others tout it.
[James’ Reply: There’s something interesting about cloud computing. The metaphor of a cloud does fit, because in the cloud, there is much that is obscure. When something goes wrong, all we can do is retry and hope for the best– as when Gmail went down recently.]
David says
I think you should find or create a forum for this subject.
Blogs quickly make everything yesterday’s news and make it hard for readers to find out what’s been newly said about any particular topic. I can’t easily scan this blog next Monday for replies to previous entries, for ongoing dialogue in the topics that interest me. I can’t even easily notice that a topic has “woken up” or begun to attract discussion. I have to actively go and check for the traffic, as the system won’t indicate to me whether it’s there. With a forum, I can simply ask “what’s new since the last time I was here?” and see it all.
This part (blogs, RSS) of Web 2.0 is a major failure on those grounds. The blog-and-feed model stifles dialogue, discourages interaction, makes things harder, intimidates the reticent, and tends to end discussion from sheer inconvenience. It’s fine for top-down communication, but pretty bad for anything else. You’ll likely need a lot of bottom-up and side-to-side discussion to get very far with this, or to keep it going very long.
I don’t mean to put my soapbox on your soapbox, honest, but I’ve started about three different replies and thought, “what’s the point?” to all of them because of this.
[James’ Reply: I know. That’s why I rarely leave replies to blog entries.]
Michael M. Butler says
I’ll point out that the Satisfice blog has an RSS feed for its comments, which seems rather uncommon. I rejoice in the fact.
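In principle, a comment feed like that is enough to build the “what’s new since my last visit?” view David asks for. Here is a minimal sketch, assuming Python with the third-party feedparser library and a WordPress-style comments feed; the feed URL and last-visit date are hypothetical placeholders:

```python
# Sketch of a "show me comments newer than my last visit" check.
# Assumes the feedparser library (pip install feedparser); the feed URL
# and the last-visit date below are hypothetical placeholders.
import time
import feedparser

FEED_URL = "http://www.satisfice.com/blog/comments/feed"  # hypothetical
last_visit = time.strptime("2009-03-01", "%Y-%m-%d")

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    published = entry.get("published_parsed")
    # struct_time values compare like tuples, so this keeps only new comments.
    if published and published > last_visit:
        print(entry.title, "-", entry.link)
```

Of course, David’s larger point stands: most feed readers don’t do even this much for comments, so the burden falls on the reader.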
Michael M. Butler says
Another thing about clouds, as I learned in aviation ground school long ago: the fact that they appear relatively fixed in shape and volume at any moment can blind you to the fact that they often have extremes of airflow and precipitation inside. To a naïve person, they look semisolid. They’re anything but. Related to this, I’m glad my main link to the Internet is, though relatively high-bandwidth, somewhat erratic. It helps keep me from getting utterly complacent.
Forgive me if I let my passions get the better of me for a few moments.
One of the things good flight training is supposed to teach students is to be ready to say “Well, there it goes!” when something fails, and react usefully. The default “condition white” response of “What the F***?” followed by stuckness is not helpful.
To the extent that software and lifestyles co-direct people into the “WTF” behavioral sink, they are to be resisted. It seems to me that part of that is getting firmer-minded about what acceptable SW quality is, and another part is not lying to oneself about the likelihood of being let down.
I look forward to your follow-on posts. I’ve dropped a note in Facebook about this thread, and hope some folks there check in.
Zach Fisher says
James,
A comment to say BRAVO and that your post inspired a post of my own. Glad to see you posting again.
Zach…
Ivor says
Wow! Wow! Wow!
I can’t agree more, but also can’t rail any less!
The economic reality that the software industry faces is based on the model that has failed our financial system: return to shareholders on their investment. If we truly want to get people to sit up and take notice we have got to make people think in different ways. Investment is not just a means to maximise profit. Shareholders are not just bystanders, but are active in the direction of the product. Software is not just a means to making oodles of cash. People can expect to make a living, not rape a product, process or company for all they can get. In short, you have to care!
Testers care. Developers care, they really do! Anyone at the coalface of active involvement in making a software product come to life cares! It’s when you have layers upon layers of people who wrap that up in models of returns and efficiencies and progressively squeeze the living crap out of those who care that you get the kind of cluster$%^& that results in despair and despondency from the likes of a great advocate such as James.
Quality is not dead! As long as we hold onto a single shred of empathy in our noggins, one single shred of decency, quality is not dead. It’s just on holidays in Bermuda, waiting for the call that we will make to bring it back front and center where it belongs. In front of our customers!
Please, oh please, don’t let go of these threads. We just have to look at new ways to weave our tapestries.
[James’ Reply: I am composing a new post on what I think we can do about the problem. It’s nice that there’s passion about this subject.]
Ivor says
Oops, premature submission there!
I would suggest that if you want to see a model of someone who meets the criteria I have described above, check out Dr. Ben Goldacre’s Bad Science blog and associated blogs.
http://www.badscience.net/
He has gotten into people’s heads about a multitude of issues (the MMR vaccine and autism, nutritionism, homeopathy, bad statistics, etc.) in relation to the everyday effects that poor quality in science has on people.
Who knows, it may be a model we (the royal we, as a profession) can follow to get our message across and make the wake-up call to quality in Bermuda.
George Crews says
Hi James,
You have a point that quality may be underappreciated. It wasn’t hard to Google an example. In a recent (although non-scientific) survey, 40% of CIOs and senior IT executives say they “don’t care at all or very little about the quality of their companies’ software.”
However, IMHO, most experienced programmers already think quality is not a priority to a significant fraction of software management. So that’s not exactly news.
And the general state of software quality has always been as poor as, or even worse than, it is now. We have been in the midst of a software crisis since at least 1968. (And I’ve been writing programs that long!) It’s just that so many more ordinary folk are interfacing with so much more complex software. The software quality issue is getting much more obvious and, as you point out, annoying.
All-in-all, my thought is that things are much as they have always been. If I were Mr. Quality (and with apologies to Mark Twain), I would respond to your post by saying that: “The news of my death has been greatly exaggerated.”
[James’ Reply: My sense is that things are much worse than they were ten years ago. Security is worse. Desktop computing is actually slower (see Infoworld article). Digital Rights overhead mucks everything up. I believe I spend more time working around problems or just suffering with them than I ever have.
“The Software Crisis” has not been much of a quality crisis (it’s been more about schedule and cost), but it’s become that in recent years.]
Rob Lambert says
James,
I think you hit the nail on the head. It is so sad, yet so true. I now hold zero expectation that anything will work on my PC.
In fact, my anti-virus has just this minute died on me…boo. Oh well, I knew it would happen, I expected it and I doubt it will get fixed anytime soon.
I’d better post this comment now, before my system reboots for no reason….
Danny Faught says
Are you saying, as a computer user, that you want the option to pay more for better quality? Or that software organizations should be able to offer you better software by using the same resources more efficiently?
You had several Windows-specific complaints. I’d like to hear more about your reasons for choosing Windows.
(BTW, I’m copying the text of this comment to my clipboard before I try to submit it, as a routine part of my risk management because of prior bad experiences with web application quality…)
[James’ Reply: I’m saying quality is dead. I don’t have the option to pay more for it. There’s no “quality market” from which I can buy more quality.
Linux, Macintosh, and Windows all have these problems. I work with Windows mostly, because the Mac and Linux can’t do what I can do under Windows. That’s not to say I’m satisfied with Windows. I’m deeply unsatisfied.
I do have some ideas, though. I’m preparing to write about them.]
BSUGrad says
You’re wrong about the Dish DVR. I have used one for over two years now after having used Tivo for probably 5 years. I too thought I would miss Tivo, but I was wrong. The Dish DVR is a great machine. It’s not perfect, but neither is your sainted Tivo. I have never had any problems with my Dish DVR, and find it very easy to use.
[James’ Reply: I’m wrong about being frustrated with missed shows and an unintuitive user interface because YOU like the Dish DVR?]
Aaron G says
A major problem with quality is that it is both subjective and relative. There is no objective way to say that X is a good product, or even that it is better than Y. Feature lists? Bug counts? Benchmarks? All just micro-facets of the nebulous “user experience”. Are the users satisfied or just complacent? Is there really a problem or is it just a couple of loud-mouthed cranks who forgot to plug it in?
I am not proposing misguided and poorly-conceived “quality metrics” that encourage abdication of competence and awareness. That can only make the problem worse. But sales figures, average hold times, company profits, and stock prices are easy. Quality is hard and doesn’t scale so well.
The death of quality has become a largely cultural and sociological issue, no longer subject to the whims of individual corporations or penny-pinching managers. They may have helped kill it, but I don’t think they have the means to resuscitate it.
I’m looking forward to hearing your ideas.
[James’ Reply: Exactly what I’m trying to get at, Aaron. Thank you. I’m not just complaining about companies who produce bad software. The techscape is such that very few people know how to create good software– not because of their skill, but because of the entropy caused by thousands of DLLs and derelict products clogging our systems, and because they won’t be rewarded if they create it. When combined with the learned helplessness of the user base, it may lead to a mass stagnation of technical progress. In other words, a sort of tech deflation.]
Jim Hazen says
James,
As usual, very insightful. I feel the same way to a large degree. Testing has not advanced on many levels, either in the process or in companies’ strategies for producing their products. Quality is a nice-to-have at times (the we’ll-fix-it-later attitude), and the overall cost impacts are not recognized or managed correctly.
My main argument is the continued naming/calling/defining of Testing as Quality Assurance. When I hear people say they are “QA” (position) or we have “QA” (function within the company) I ask them what they do. Ninety-nine percent of the time I hear “we test the software”, to which my reply is “then you are Testing, not QA. QA is a lot more than just testing.” And this leads to an argument, but by the time I get done explaining all the other facets of a Quality Assurance function (like SCM, Metrics, Risk Management, Audit, Process & Procedures, etc.) in addition to Testing, the person realizes “gee, I’m just a Tester”. Which isn’t a bad thing.
As you and others have stated, it does boil down to the basics of economics on a project. And it will take “hitting the wall” for some companies to wake up and smell the coffee in order to get their acts in line. For others, it just won’t sink in.
It sucks for us “professional” testers because we want to make things better, but we are butting our heads on the walls at times. I guess you could say we are a bit “insane” (one definition of insanity is doing the same thing over and over again expecting a different result each time), or just plain masochists (“Thank you sir, may I have another?!”). But you have to admit at times it does pay the bills (I know, give me crap about bad attitude 😉 ).
As my blog title says (and something I once told Scott Adams)… “Testing is Irrelevant, Shipping is Futile!” (to rip off the Borg).
I look forward to the rest of the series. Best to you dude.
shrini Kulkarni says
James,
In the last 5 out of 5 assessment studies where I was involved as a consultant to study testing practices in organizations and suggest improvements, reducing the cost of testing was a big and major objective of the testing bosses. Almost all of them favored the route of outsourcing and AUTOMATION. Many felt that through automation they could reduce testing cycle time, thereby reducing the overall cost of testing.
It seems that when you are in bad shape, any suggestion that gives you hope will become an immediate hit. Hence, for many of these folks, the idea that test automation reduces testing cost sounds like music to their ears, so much so that they are not willing to question any of it. They think buying test tools and engaging an offshore vendor to create automation scripts (scripts?) will really lead them to testing cost savings. Bizarre… but true and rampant.
It is sad that they are not exploring other low-cost options: checking what could be done away with, what testing they should do, what kinds of problems are being noticed, what testing skills they need to acquire, whether they need to execute all those 10000 test cases blindly for every cycle… As you mentioned, there are indeed cheaper options to reduce cost.
No one seems to be paying attention to these… because they are cheap? The IT community, in these hard times of economic slowdown, needs to go back to the drawing board and rediscover some basics of using and managing software.
Echoing the statements of Ivor: software issues/bugs are real, the losses they cause to business are real, the money businesses spend to create and manage software applications is real, and the need for software application quality (value to the person(s) that matter) is more REAL than ever. Then why is quality DEAD? Maybe it is not. It is certainly in danger, in the ICU, or may have slipped into a coma.
There is a perception of quality being dead because those who appear to be worrying about it, the apparent safekeepers of quality, are presiding over so-called ceremony-oriented, tool-based, low-cost options.
Shrini
shrini Kulkarni says
Is it that people have taken “good enough quality” too literally and are rushing to develop/test/deploy applications that work only for that moment or given day? I would like to mention here that when developers or testers face gripes about “low” quality, they often point to a “lack of time” to do stuff. Requirements change too fast, application platforms change, the business landscape changes, business models change, companies go bust overnight, and new alliances are formed over the period of a week.
With so much flux and so many random events in the business arena, how can we software people remain isolated from it? Quality suffers as everyone seems to be in an “unknown” HURRY. “Let us fix them in the next release” always works in such situations.
Where is the problem then? Are we being too lazy to adapt to the changing business and technology landscape around us? How are we doing with respect to our struggle in the “survival of the fittest”?
Most importantly – How are we responding to CHANGE ?
Shrini
[James’ Reply: I believe in good enough quality. But I think conversations about good enough are dying away. Part of it is that dedicated testers (whose traditional job is to raise awareness and make sure that conversation is healthy) are being thrown out of projects, replaced by drones who have no investment in the outcome, or programmers with a puppy crush on automated test tools.]
Matthew Heusser says
My neighbors are in their mid-50s. Every couple of years, they buy a new Dell. Top-of-the-line – the last two were “media center” models, I think.
Then they call me over to get it running. I think “that should not be necessary”, but then, holy cow, I look at it.
Extra icons EVERYWHERE – games, AOL, ISPs, download music, multimedia, blah blah blah. Half the screen real estate is screwed up.
Installing MS Office takes about an HOUR and involves a half-dozen disks and a 25-character code. Then there’s getting the internet up. Then configuring email… eventually I realize this is a multi-hour project for me. There’s no way they could do this on their own.
AND there is no reason it should be that hard! The Mac out-of-box experience is nothing like that.
What do I blame? Well, I think quality died because we focus so hard on measurable things (schedule, features) that quality was squeezed out. How many developers or even testers are willing to stand up and say something like “We can’t do this; it’s irresponsible, it will lead to a terrible customer experience. We are killing our customers by adding ‘one more link’ to the website a week. The experience is terrible…”
No, we’re /scared/. And, in a culture driven by hard numbers (schedule, cost), trying to express something as humanistic as an “unusable experience” doesn’t fly in general – and people are /scared/ for their jobs.
This is also why your customer service experience for your ISP sucks. I mean, it’s really bad. You wait half an hour to have a ‘bot tell you to reboot your router. After a half-dozen things you’ve already done, you are put through to a person with an accent who tells you to… reboot your router.
It’s because the price of the customer service is easy to measure and the /service/ of the service is not. A few companies (Dell) had enough push-back to undo a big mistake in outsourced customer service. Sadly, that’s the exception.
Exceptions, however, abound. In general I am very happy with the Agile/Lean software dev movement and its focus on not being stupid. (A more polite way to say it: its focus on eliminating waste, systems thinking, and systems improvement.)
Tivo still exists; the iPod still exists; Mac OS X still exists. In some parts of the country there are still employers willing to take a step back, look at the product, and say “awww, that’s crap. We’ve got to redo this and do it well.”
I seek those companies out and try to work for/with them.
Joseph Ours says
James,
I almost agree with your premise that Quality in software products is dead. However, I do disagree with the argument you present for your hypothesis. Testing is a quality control function. As such, it can only tell you how a product performs against an oracle. I’ll concede that action taken based on that quality control check can result in an improved product. But in reality, testing does not inherently improve the quality of a product. Referencing layoffs in the testing department of one company, which does not appear to be replacing that group’s set of skills with an equivalent version, has nothing to do with your hypothesis. If I improve a process that improves the quality of the product and requires fewer quality control checks, then laying off “checkers”, or testers if you will, is an appropriate business decision.
[James’ Reply: I’m sorry it wasn’t clear. I wasn’t saying that laying off testers is an example of quality being dead, I was trying to say that replacing good testers with fake testers (to put it bluntly) is an example of that. Thoughtlessly decimating the testing staff is an example of that (it’s thoughtless because they haven’t investigated what the testers actually do or what value they provide).]
I would like to offer a slightly different hypothesis: “Quality is not dead, but lower-quality products are now acceptable.” Ultimately, quality affects the demand side of economics, but so do various types of competition. If the customer base is unhappy with the quality of products and services being offered, then they have several options (let’s use the Zune as an example):
1) They can choose to still acquire the product (Zune)
2) They can choose a direct competitor’s product (iPod)
3) They can choose a substitute product (Just an MP3 player)
4) They can choose an alternate product (Turn on the radio or TV)
5) Or, they can choose to do nothing
Each of those options has its benefits and drawbacks, as well as costs. So, the quality of a product will drive a customer to an option. These options can be distorted in monopolistic industries, product lines, and companies. Therefore, so long as a company is getting enough customers to purchase its product, the quality level of the product is acceptable. Therefore, customers are in part to blame for the deterioration of quality. So long as they are willing to purchase a company’s product, they are essentially saying they approve of the current quality level and price level. You mentioned that you loved Tivo, but left DirecTV due to a lack of quality in the customer service aspect of their company. Your choice was to move to the Dish Network and an inferior DVR. You did exactly what customers do, and in doing so perpetuated the deterioration in quality. If enough Jameses left DirecTV, then they might improve the quality of their products. But as far as Dish TV is concerned, their DVR is just as good as Tivo, as evidenced by their gaining customers faster than they are losing them. Additionally, as customers become accustomed to lower-quality products at their current price points, they become desensitized to expectations of higher-quality products. These points are what allowed Microsoft to release Vista the way it did. Ultimately, the customer base did not grow at a rate Microsoft found acceptable, and they were forced to address quality in other ways: service packs, the Mojave campaign, etc.
I realize this is an oversimplification, but I want to illustrate the point that so long as customers buy products of inferior quality, a company has no incentive to improve. Therefore, quality is not necessarily dead; rather, these lower levels of quality are acceptable.
[James’ Reply: You are speaking up for the second part of my hypothesis. I say quality is dead because of the first part that goes with it: prohibitive entropy. It’s too hard to make products work.]
Damian says
An excellent article.
I tend to have to deal with the worst of all worlds: management who demand the highest possible quality (without understanding what quality is or what it costs), never grasping that quality costs money. We tend to see-saw every year or so, with the More Quality! crowd winning in odd years and the Lower Cost! team triumphing in even.
And as for your TV woes, over here in NZ we have the equivalent of the British Sky DVR. It is appalling (how hard is it to remember the shows I watch? Or be able to run for more than, say, a week without a powercycle?), but every complaint I make about it results in the electronic equivalent of a shrug. However, because the modicum of convenience it offers me is greater than the effect of the “(c)ontinuous, low grade confusion and irritation, occasionally spiking to impotent rage” it engenders, they have no compelling reason to fix any of my complaints: I have no viable alternative, unless I feel like building my own media center (I don’t) or I want to go back to the comparatively monolithic manual recording (ditto). I agree that it is the perfect example of the current consumer-corporate relationship, in that they will continue to give me bad service and an annoying product at a high price and I will continue to pay them, complaining all the while. There are various metaphors I could use here, naturally; Turkish prisons spring to mind.
[James’ Reply: Prison. Yes, like prison.]
John McConda says
Hi James, just a note on DirecTV. In my experience, they’ve dramatically improved the quality of their software in the past 3 years. It was pretty rough right after they ditched Tivo, but I think one of the reasons it has improved so much is their Cutting Edge program. It allows volunteer customers to beta test new features on their own receivers.
In fact, they’ve recently announced that they’re back on with Tivo, and many of us in the Cutting Edge community are wondering if it will be a step backward.
David says
There is no apparent return on investing a dollar of the budget in improving product quality. If I make a superior product, I cannot charge more for it in this market; I might not even be able to sell it at all.
xBase was vastly superior to MSAccess, yet it’s dead. Et fricking cetera. Corel. Lotus.
Those who make the purchasing decisions are not those who had to use the products, but to paraphrase an earlier wisdom, “nobody ever got fired for buying Microsoft.” So they did. Then they did it more, to be compatible with everyone else. Now, if somebody sends a WordPerfect document or an Open Office spreadsheet, we have to ask somebody how to open it. Even though OO is free, it still can’t gain significant traction against MS. Relative quality of the two products doesn’t even get considered.
THAT is a distorted market.
If I chop the testing budget, however I do that, there is no corresponding drop in sales or income, no unrecovered rise in support costs, no market mechanism to urge a correction on me. There are only my improved numbers at the end of the quarter, to reward me for my action. I might even get a note of thanks from Product Support for boosting their revenue.
This creates an anti-quality trend, a mechanism that selects for lower quality. (I cut my testing budget, I prosper and get promoted. You don’t cut yours, your bottom line looks worse than mine, you get fired or sidelined.) Its epitome is the EULA clause that denies any warranty at all, any responsibility for consequential damages, or fitness for any purpose whatsoever. You wouldn’t accept an automobile on that basis, but you do accept all of your software so.
Until market mechanisms function again and the effect of Microsoft’s monopoly wears too thin, until there’s a clear downside to poor quality, I don’t see this situation changing.
Ivor says
Hmmm….
I really expected more from a community of people who subscribe to a blog such as this.
The cynicism is unbelievable. Although my wife would say it is more pragmatism, I think it displays a fatalism, as if we are powerless to do anything about things like this.
I’m Irish. The term Boycott was defined, promulgated and shipped worldwide from my green isle during a period in our history when we were forced to accept the norms and standards of a bloated empire (just to state the bleeding obvious here, I am a patriot and nationalist, not a republican and racist!). A group of poor tenant farmers were so outraged at their treatment by a landlord, that they decided to shun him, to isolate him and ultimately to hit him where it hurt, in his wallet!
In these days when the world stage has shrunk to the size of a local theater, it is obvious what “the customer” should do: not seek the lowest common level and accept a base level of incompetence, but demand that the experience and efficiencies promised are delivered!
Vested interests are hard to break. Look at NASA, or the US DOD, where these functions of a larger government (and by extension the will of the people) are more beholden to the industries that are meant to serve their interests.
A couple of ideas:
1. Make a stand, don’t accept less than you pay for, and always remember you pay for what you get!
2. Why don’t we link up with consumer agencies, designed and set up to protect consumer rights, and offer a service to them? (Sorry James!) Certify a product, give it a health warning, and have it pushed out by agencies that have teeth!
3. Publication of formalised literature on the nature and extent of the problems faced by customers when using software can be powerful, if we play the game that is in front of us and do not bemoan the skill and tactical genius of the corps(e) that make the end runs around us. Again, link it to the consumer agency model: a PubMed-style concept that allows peer review, publication, and easy access for the great unwashed.
Of course in these uncertain times, where mortgages have to be paid, children raised and yes, even food eaten, we have got to pay the piper and dance to his tune. I feel a bit like Henry II when he raised himself off his sickbed to ask “Will no one rid me of this troublesome priest?” of Thomas Becket, Archbishop of Canterbury. The nature and size of the task is such that it would require a lobby group along the lines of Earth First or the Sierra Club to make a big difference. Of course the oft-quoted “From little acorns do great oaks grow” mantra is always an option.
Anyway, thanks to James for allowing me this moment on the soapbox.
Bret Pettichord says
James,
I think you would really enjoy using a Mac. I got one about a year ago. A lot of people who care about quality have moved to the Mac, and there are a lot of Mac applications being developed for people who care about quality. I also use Windows every day. I run it on my Mac using VMware Fusion, which is a totally awesome program. A quality program. (And get Time Machine too.)
Bret
Tim Coulter says
In response to David way up high:
My company is trying to solve this problem by intelligently aggregating RSS feeds by both recency and other qualities you specify. It’s still in beta (if you can call it that), but we just got the go-ahead to tell people about it. You might find it useful: http://www.melkjug.org
Now, let’s see if David actually gets this…
David McKenzie says
If you want to get a little hope back, you should get an iPhone. I don’t know how Apple does their testing, but whatever they are doing, it seems to work. iPhone + services (App Store, iTunes, iTunes store) is a pretty complex system, but it “just works” to a remarkable degree, is fun and easy to use, and is just plain beautiful as well. (The latter is an aspect of software quality that is all too frequently overlooked.)
I also think that you are giving Mac OS X short shrift. I can’t recall the last time I had an OS-related issue with my Mac. Issues with various third-party applications, yes, but a problem with the OS, no. Evidence from my personal experience is that Apple is a company that still really, really cares about getting it right.
I think that a large proportion of the market success of the iPod and iPhone is due to their superior level of design and software quality or “polish”. It is a shame that most of Apple’s competitors don’t seem to have been able to figure that out.
Tom Eble says
Regarding the Windows complaints, I think the writing is on the wall that sooner, rather than later, your desktop will be on the intarwebs. Then we’ll have a whole host of new quality issues to deal with as well…
Michael M. Butler says
No, my desktop will not be on the intarwebs. Not as presently constituted and accessed. I decline to not be able to work offline.
Michael M. Butler says
Also, regarding the iPhone, I believe it’s reported that the vast majority of the apps downloaded for it are used fewer than a half-dozen times per client. An interesting number… Software as amuse-bouche?
What is the point of all this? I sometimes think of my attitude as that of an agnostic Amishman from the 25th century… were there ever to be such a thing.
Ivo says
Consumers and companies buy cheaper software with lots of bugs over quality products. They never get the chance to notice that the higher quality product requires less maintenance, is easier to extend, etc. It seems impossible to establish yourself as a quality software manufacturer, for whom customers are willing to shell out a bit more. It’s one of the circumstances under which free market capitalism is incapable of reaching the optimal solution.
Midas says
Buy a Mac! Or install Linux. Windows is the root of your trouble.
[James’ Reply: I have a Mac, Windows, and Linux. I have the same categories of problems with all of them.]
David says
@Tim Coulter: Oy, what a name! “Melkjug”?
I score Melkjug as a clean miss. It’s still going with the blog-and-feed system, which disregards comments as unimportant. That being so, Melkjug fails at the most important part of this problem, which is to bubble up a blog topic solely because there are new comments on it.
Melkjug (I did try it) doesn’t even show me the comments, and RSS doesn’t flag new comments, even optionally, much less keep track of which ones I’ve already read. Melkjug’s tuners do not offer the option, “new comments” or “comment by a particular person”. “Starred by”, yes, “Dugg by”, yes, but not “commented by”.
Y’all couldn’t add that if you wanted to because RSS (I include Atom here) doesn’t present any information about the contents of comments at all.
I have to remember to manually dial up this topic, because my reader doesn’t pop up anything new until James writes another post. Then I have to bring up the particular post in a view that includes comments (which my reader’s view does not), and then scroll down the comments while trying to recollect where I left off so I can see which, if any, are new since my last visit.
I am going through that process for this blog and this particular subject, but I will not do it as a matter of course or for most blogs or subjects. Neither will most people, and so the discussion dies, not from lack of interest, but because the mechanics of keeping up with it are just too cumbersome.
Mendelt Siebenga says
I fail to see how this is a new problem. I’ve used systems running Linux, Dos, Windows (from 3.1 to Vista), OSX and even Amiga OS a loooong time ago. Some systems were more “buggy” than others but I don’t see quality decreasing over time. I do see complexity increasing in line with our expectations of the system.
[James’ Reply: I’ve used all those, too, except I started with Windows 3.0 and I was also an Amiga developer. I also used the Commodore 64, Radio Shack Model 100, Vic 20, Apple II+, Apple IIe, Apple IIgs, Apple III, Mac 128K, Fat Mac, Mac SE, Mac II, Mac IIc, Mac Portable (the very first Mac Portable, which weighed something like 14 pounds), iMac, MacAir, and also a Sinclair ZX81, a Burroughs running CP/M, an Osborne, a Treo, a Magic Link PDA (which sucked), an iPaq smartphone (which toootally sucked), a Blackberry (which isn’t bad), an iTouch. I spend a lot of money on technology!
Let’s take the Amiga. When I worked with it, it had an operating system that would run about 10 minutes without crashing. It was terrible. But that computer was a toy. That computer wasn’t running our lives. Now the computers I’m talking about ARE running our lives. They are central to our lives.]
Autoruns under Vista is a nice example of this. You want to monitor Vista the same way you’ve always done. You expect it to behave the same way as older versions while at the same time you expect it to have more features. You expect change while at the same time you expect things to stay the same.
[James’ Reply: A complexity threshold has been crossed with Vista. As has an obscurity threshold. I rapidly gained facility with NT and XP, but Vista seems to have been designed to foil user configurability and serviceability.]
If you look at the software market as a whole, I think software has kept up with the demands and expectations of the public remarkably well. People expect a certain quality: products that fall short die, and products that exceed this expectation at the expense of features people want die too. Sometimes markets lag behind, but usually not for long. Mobile phones and PDAs had this problem for a few years. That market has been woken up recently and is improving rapidly.
[James’ Reply: Your experience seems quite a bit different from mine.]
James says
Glad you brought up digital TV.
A dial-up BBS running on a Commodore 64 could serve up a TV guide faster than these pieces of ****.
Scott says
James,
It seems that software is not the only place where ‘quality is dead’. Look at most of the physical stuff you can buy at any store, even the high-priced ones: lots of crap. Software suffers from bloat and complexity, which makes testing more difficult and expensive. We are seeing the effects, but what are the causes?
The fundamental cause, I believe, is human nature in the modern world.
While we all want quality, we are at the same time trying to get it for the lowest price. And since quality is opaque, in the sense that I can’t tell beforehand whether a particular piece of software or a particular product is of good quality, the only discriminators I have (usually) are price and the marketing information provided by the seller. True, there are reviews of products on the web and there is Consumer Reports (which I subscribe to), but I think the sheer number of products and possibilities is overwhelming us.
Apply this to public companies and quarterly reports, and it’s obvious that since visible cost is a tangible, easily measurable thing while quality and hidden costs are not, our focus within corporations and other organizations will be on how to cut costs. Job mobility, even within the government, means that by the time the cost cutting impacts the bottom line (still in hidden ways), we have moved to a different role and often a different company or organization.
Apply this to yourself and your collection of things and you might find that you, as I and most other people, overvalue ‘stuff’. Paul Graham’s essay on ‘Stuff’ is a good read.
The incentives are asymmetric due to our innate nature: how easily we are fooled, how difficult it is to see our own biases, and how these drive behavior, individually and in organizations, toward the results we see. As a shareholder of a company, do I really care about their product, or do I care about the return on my investment? Do I care how they bring me that return (especially if I can’t see or argue the case for quality in the first place)? So what will always win out, in general, is lowest cost, which eventually equates to lower quality. It’s a race to the bottom, hoping in each iteration that the revenue earned is at least higher than the real value, in terms of quality, of the products we produce.
Maybe I’m just pessimistic; lots of people may outright disagree with me (the number who would agree in this blog don’t matter because it’s likely that people who write responses here usually agree with your statements, so all the other disagreeing opinions self-select out and we don’t see them).
I believe there are products that are higher quality than the norm (but do I want them for less money? You bet!). I also believe there are outlier companies that do focus on better quality because they believe it will generate (hidden) returns. Toyota cars come to mind. Apple hardware too.
I don’t have time to write a quality response (or I have too many other things to do that I won’t make the time?), but here’s a list of things to read that I believe has everything to do with this topic:
The Market for Lemons (search in Google for the paper)
The Paradox of Choice (Barry Schwartz)
Fooled by Randomness and The Black Swan, both by Nassim Nicholas Taleb
Managing Software for Growth, Roy Miller (this is an EXCELLENT, unsung work).
… and more that I can’t think of right now.
/s.
[James’ Reply: If you scan the comments, you’ll see quite a few disagreers. I’ve rejected only three or four that were just too stupid to post– the equivalent of not letting someone drive drunk. I mean, one guy said I’m obviously new to computing (kids say the darndest things).
Much more significant, as you point out, are the disagreers who would never comment on my blog. No one knows how big that group is. But I don’t understand how their experience of desktop computing, each day, can be substantially different than mine. I assume they are numb to what’s happening.]
Jon says
Perhaps the Free Software movement would interest you. There is much work to be done. Join us.
[James’ Reply: Yes, that will feature prominently in my next post on this topic.]
Kevin says
As a twenty-year veteran, I would ask the question: was quality ever really alive? The software we develop these days is at least an order of magnitude more sophisticated than it was 10 years ago. Our methods, tools, and techniques for ensuring quality have advanced considerably but have not kept up with that sophistication.
In addition, I would propose that most bugs in our software these days are not introduced by the developers but introduced by the culture in which they work. Following the latest design pattern or approaches such as TDD or BDD will not fix these root cause bugs.
All the issues outlined in classics such as Mythical Man Month are just as much a problem today as they were when that great work was published.
[James’ Reply: It’s never been very healthy, but things have developed since XP was shipped that dealt death blows to quality. Several commenters have suggested that things were just as bad in the old days. No… I was there. I was able to make my desktop function most of the time. My ongoing education as a technical man kept me on top of it. But today there have been huge shifts. Digital Rights Management and security concerns have caused a general closing of the once-much-more open desktop. Software is far more bloated and interdependent than it was. Hackers have created a vast shadow-world of malware that continually assaults our systems. Nearly everything is web-aware, which often impairs local functionality as I must be online to get work done.
It takes a lot of my time to clean, defend, and maintain my desktop systems to a minimally functional level. It never used to be this hard.]
Umberto says
James, the reason software quality sucks is because making high quality software is expensive. It takes significantly more time and resources to design, build, test and certify good quality software. Someone once said that corporations are giant cost-externalizing machines. This explanation makes perfect sense: at the company where I work I hear colleagues joke about how our customers do our system testing for us.
QA staff in my department has been pared down to a few over-worked individuals. The best they can do is catch the really bad issues before we ship each release. We developers do what we can, but there’s only so much quality you can manage with limited manpower, everyone working 80-hour weeks, and relentless pressure from the top to get the product out the door on schedule. Needless to say, those schedules are determined by considerations such as revenue recognition rather than how long it actually takes to build something well.
Our customers are likewise cost conscious corporations that work their IT departments to death getting it implemented. This is cost-externalizing gone mad – costs pushed on to everyone and everything except their damned balance sheets.
That’s why the future belongs to free, open source software.
Dan says
James,
The core of your thesis reminds me of discussions I’ve had about the “declining quality” of modern-day literature, television programming, character of political candidates, customer service, etc., etc.
If there’s no demand for “quality”, and only weak penalties for failing to achieve it, then what can we say about its value?
[James’ Reply: I think there is a demand for it, in the abstract. Yes, there are people who don’t mind all the problems– the lost time and lost data. Perhaps those are people who don’t do much with computers, or who use computers for recreation only. Personally, I spend my waking life mainly using one of several computers. But I’m frustrated with poor quality and my frustration has directly impacted what I will buy and what services I will use.
The problem is not lack of raw demand, but lack of effective demand. This is because we are tired of fighting for it, and we don’t trust it when it’s offered. Claims about quality are often just hype. Why are they hype? Because vendors simply don’t know how to provide a highly reliable computing experience. That’s why I say it’s dead. It’s not undervalued, it’s dead. Vendors are to the point of faking it; going through the motions.]
Certainly it has value to you (and, incidentally, to me; I’ve been in QA for 10+ years), but what about everyone else? Does no one want software that “pleases” end users (avoiding entirely the question of providing “joy and salvation”)?
The software industry is still young. Those of us embedded in it may wish for it to mature more quickly but, like steam engine or automobile technology, it will move at the fastest pace the larger public can sustain. Early technology IS low quality (with occasional notable exceptions that become justifiably legendary). This is a cultural question, not a technological one.
I agree that low quality, even horrifyingly low quality, is the current norm (again, under a definition of quality which we probably share). But the industry will not change under its own impetus; there’s no motivating dollar for doing so. Those dollars have to come from outside, and they won’t. Not until we accumulate enough cultural experience to know that something better is attainable and, eventually, available. This will only happen after we see enough flagrant waste of time, money and, sadly, lives, to drive awareness of alternatives.
[James’ Reply: If you look at history, substantial improvements of complex systems tend to come about only in the aftermath of disaster.]
kL says
Your desktop says you need to switch from Windows to anything else!
• On OS X most applications are self-contained and don’t need to be installed.
• Most running processes are easy to figure out (partly because the 8+3 naming scheme isn’t rooted in the OS).
• There’s a hard push by Apple and users for applications to respect UI guidelines. *All* apps have the same name and keyboard shortcut for the Preferences window. Can you imagine?
• Moving to a new machine is super-easy. All the stuff you care about is contained in your home directory. Applications can simply be copied to the new machine without breaking (except damn Adobe’s apps with DRM). The Migration Assistant does all of that for you during the install of a new OS.
Yoe says
I just finished a terrible job. I’m in my late 30s and all my coworkers were in their mid-20s. They seemed to hate the fact that I was there, writing good quality code with comments, properly named variables, good program structure, and all the practices that I was taught when I was *18*.
It is hard to fathom what has gone wrong with young CS majors. Why do they write such awful code? Is it because they started with C++? Or is it just some collective character flaw in their generation? My generation is not perfect by any means, we were too greedy, because we were told to be greedy by Reagan and Wall St. But still, the lack of discipline of the new programmers is pathetic, horrible, unconscionable.
Chris says
Honestly, and I hope you’ll read my whole message and hear me out, and I’m not just a fanboy, but you probably should try a Mac. Here’s how some of the things on your list are fixed on a Mac:
Software installation is mysterious and fragile: Mac programs are simply directories ending with “.app” that appear as apps, but all data files are actually inside that directory. Programs can install data in other places, but it’s minute and predictable. (See the sketch at the end of this comment.)
Old data and old bits of applications choke my system: Also fixed by the above.
My task manager is littered with mysterious processes: The Mac runs Unix inside but only shows you valid “Mac OS” processes – nothing mysterious: running Word and Firefox? Then you’ll just see Word and Firefox. If it’s a system process, it’ll be hidden, unlike Windows.
I see a dozen or two “Is it okay to run this process?” prompts: The Mac doesn’t ask Vista-esque questions like that. Your user account can’t damage the system (it’s part of a Unix-style user account system; you don’t have access to system files), so apps never need to ask whether something is safe. It just runs, because it is safe.
I click “I Agree” to EULAs that I rarely read: I have to agree with you here, but this is just something software creators do. Switch to Linux if it’s a really big legal worry. I’ve yet to hear of a case where this really changed somebody’s life.
Peripherals with proprietary drivers and formats don’t play well with each other: The Mac comes with the drivers you need. If it doesn’t work with a Mac, it’ll say so on the box. If it does, you just plug it in. You don’t have to install drivers for almost anything: printers, cameras, etc. It is possible to have to install drivers, but it’s usually for very specific reasons, e.g. you’re setting up a music studio and need a USB sound card with 12 XLR inputs or something you’d never need.
Upgrading to a new computer is now a task comparable with uprooting and moving to a new city: When you first boot a new Mac or install Mac OS, it asks you if you have an old Mac and migrates everything automatically. No kidding!
Hope that helps. I used Windows for a decade and administer Linux servers for work, but use Macs at home. If you just want things to work, get a Mac. And don’t listen to other people who say they’re overpriced: Macs tend to be “decked out” with features, so configure a Dell system with the same hardware and watch that “Mac tax” suddenly vanish.
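To make Chris’s first point concrete, here is a minimal sketch (in Python, chosen arbitrarily; it assumes a stock Mac with Safari installed, and any .app bundle would do) showing that a Mac application is just an ordinary directory you can inspect:

    # A minimal sketch: a Mac "application" is an ordinary directory bundle.
    # Assumes macOS with Safari installed; any .app bundle would do.
    from pathlib import Path

    bundle = Path("/Applications/Safari.app")
    for item in sorted((bundle / "Contents").iterdir()):
        print(item.name)  # typically Info.plist, MacOS, Resources, ...

Dragging the .app directory to another machine moves everything this loop would list, which is why copying and deleting applications stays simple.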
Ignobilitor says
Quote: [James’ Reply: I have a Mac, Windows, and Linux. I have the same categories of problems with all of them.]
I’d be interested to hear some details of the problems you have with the Mac, because what you describe doesn’t sound like my experience with OS X and most of the applications I rely on every day. As with another commenter, I run Windows in an emulated environment every day (I need to use an analytics app that’s available only on Windows), and I can definitely attest to encountering the issues you describe within the Windows environment.
[James’ Reply: The usual glitches apply on the Mac– strange hangs and crashes and problems with online applications (I switch between Safari and Firefox to evade them). But there’s a serious problem with the Mac: I do a lot in Microsoft Word and Microsoft PowerPoint, but these applications are not fully compatible between the Mac and PC. I think anyone who thinks they are must not use them very much. Cem Kaner tried to switch to the Mac a few years ago and retreated for that very reason.
I like the Mac. But I doubt that it can evade the basic dynamic I’m talking about. I rather think something else is happening: most of us have become very numb, and some of us have stopped trying new software.]
This column reminds me of observing the father of one of my childhood friends tinkering with his Heathkit stereo many years ago. It should have been the audiophile’s dream, based on the specs, but somehow all of the myriad parts never added up to the promise. Had he broken down and shelled out the dough to get what at that time was the cream of the crop (probably some Bang & Olufsen system), he’d have experienced the quality he’d been seeking.
[James’ Reply: I have spent a great deal of money on technology– several hundred thousand dollars in the last ten years. I once equipped and ran my own test lab. Today I own several laptops, including a MacAir, a netbook running Vista, a netbook running XP, and two Toughbooks (both running XP). I ran a Linux server for some years, now abandoned… I also use a Blackberry, an iTouch, a Kindle I, a Kindle II, and an iPod Nano.]
Nicolas says
I agree with some of the points you make, but not with your premise. As computers become more and more mainstream, quality is sometimes sacrificed too much (cough… iTunes on Windows… cough). However, I think that overall software is much better than it was ten years ago, although it might not feel that way.
Think about it: ten years ago most of us were using Windows 9x, which was plagued with BSODs. Now when was the last time your computer froze? When was the last time a program just simply crashed on you? Yes, it still happens, but at a rate not even comparable to ten years ago. A few weeks ago, while I was in the middle of a normal browsing session, with about 6 or 7 tabs open, Firefox crashed on me. The main window disappeared and a small dialog box opened to inform me that Firefox had unexpectedly crashed. I was presented with the choice to restart Firefox, which I chose, and it immediately reopened with all my tabs as they were before the crash. So basically, I lost about 5 seconds and did one click when this application crashed. Is this lower quality than before? Try installing an old version of Netscape to see, then tell me if quality for browsers is dead. Even IE is way better than ten years ago.
Ten years ago, if I hit alt-tab by accident in a game, I was done for. I had to wait a few minutes before returning to the game because the computer swapped all the memory, and most of the time the game ended up crashing. Now you can run a lot of games in windowed mode (fullscreen or not) with virtually no penalty for alt-tabbing.
I believe it’s just our expectations of software that have changed. OK, maybe there is more crappy software around today, but why would you use it? There is almost no need for it (there are many alternatives to Adobe Acrobat). Yeah, maybe iTunes is necessary for the iPhone… but my point is that if you tried to install a ten-year-old OS with only ten-year-old apps, you’d probably suffer much more than with today’s software.
[James’ Reply: It’s difficult to judge these things. I wrote what I wrote because I have come to a personal watershed moment, after many years as an enthusiastic technologist.
I agree that the number of crashes has not increased substantially in recent years. What has changed profoundly is the causes and implications of those crashes. My applications and desktop are much more interdependent than they were ten years ago. There are many more intrusive resident processes (many with security implications). My computer is certainly slower than it used to be (InfoWorld did an article on that phenomenon), and probably most importantly, distributed component-based development has distributed responsibility for crashes such that it’s much more difficult to report, diagnose, and fix serious problems. Programmers and tech support people shrug and say “Didja try rebooting? Didja do a virus check? We don’t know what’s going on!”
In my personal computing, I once knew how to live with minimal exposure to data loss and inconvenience. Now I’m trapped, as under a desert sun without shade, and I’m burning.]
Kraln says
A lot of your gripes relate to two fundamental concepts. The first: it’s harder to write good code than it is to write bad code. The second: good code turns into bad code over time.
[James’ Reply: I agree with the concepts, but that has not much to do with my hypothesis. My hypothesis has two parts: entropy and conditioning. Technological entropy has grown to a point where no one knows how to create reliable software that pleases the users it ought to please, and users have been conditioned to accept this.]
Back before instant distribution and painless patching (I’m looking at you, internets), companies generally had to get it right the first time or completely lose out. When you aren’t penalized for making your end users your QA department, there’s no reason to retain one. Just set up a forum and as the bug reports roll in, issue patches.
I must say, I have in general had a higher-quality experience with OS X, but no system is perfect (even Amiga and BeOS have downsides).
Penguin Pete says
Oh, will you people just learn Linux already? Get really smart, read a pile, know what you’re doing with it, and Linux will make you a millionaire, cure your cancer, and get you dates too.
I’m tired of being subtle about it. It’s pretty obvious that everyone knows that Linux is the Second Coming of computing, they’re just too stubborn to admit it.
Peter Bierman says
[The usual glitches apply on the Mac– strange hangs and crashes]
I’m biased; I was an OS Software Engineer at Apple for more than 10 years until I recently left to start my own business. And from that perspective, I can tell you that on any platform, “strange hangs and crashes” have an underlying cause.
I don’t know their cause on Windows, though the conventional wisdom seems to be poorly written software multiplied by the vast quantity of software added to a typical Windows desktop.
But on the Mac, “strange hangs and crashes” are much less common. As someone who has personally worked on literally thousands of Macs, and browsed the crash logs from millions of customers, I can tell you with authority that “strange behavior” in Mac OS X is caused by defective hardware, usually bad RAM. Apple is in a unique position of having software that is tested to much higher quality standards than the rest of the industry. If Apple could more proactively identify bad RAM in customer desktops, I think they would quickly crush the conventional wisdom that Macs have the same “usual glitches” as Windows desktops. Sadly, it only takes a small percentage of customers with this very mundane defect to color everyone’s opinion. And users that upgrade their own RAM are more likely to spread the tales of their unstable systems far and wide.
You can dismiss my thesis as fanboyism, but I think you’ll find it fits the facts better than saying “everything sucks, no one is trying.” Some companies are trying. I’m not surprised the few successes are overwhelmed by failures on the Windows platform. But the tone of comments here supports my assertion that most Mac users experience a night-vs.-day difference in quality. Sadly, it only takes a small bit of systemwide instability to make anyone think technology must all just suck equally.
[James’ Reply: I worked at Apple, as a test manager, from 1987 to 1991. I was there for the great headache of getting System 7.0 out the door. I would not say, at that time, that Apple was employing qualitatively better practices or was more skilled than other companies. While I was there, in fact, the testing culture was being systematically dismantled in a political war with programmers.
I would definitely say that Apple was passionate about usability, back then. Although the usability visionaries were not supported as much as I would have liked at the time, it seemed they eventually won the day.]
MP says
I recently had the fortune (misfortune??) of attending a conference on testing and test automation (God-forsaken territory, that!!), and I was so depressed at the end of Day 2 that I lost interest in most of the stuff the presenters were talking about.
There was no discussion of what motivates a software constructor to do the right thing, or of what shapes the idea of quality.
There was no discussion on the ethics of software and the contract between the user and the creator of the software.
There was no discussion on why every software development team blindly accepts that they will repeat their mistakes over and over again.
There was no discussion on why we should not believe in buzz words and instead look back on the history of our field and how it reflects the state of our world today.
There was no discussion on how awful it is for my mother (or any mother) to even create a social networking account.
I sadly but completely agree with your assessment.
Tyler says
https://twitter.com/rands/statuses/1294696649
[James’ Reply: Well, this is kind of a “quality is dead” example. The commenter here has supplied a link to a tweet whose message is only two characters longer than the URL itself. This is technological fetishry without purpose. Twitter, in this case, is not facilitating communication, but rather acting as a barrier to it. BTW, you don’t have to follow his link. The message is “Quality isn’t dead, you’re just running Windows.”]
Hank says
I had my heart attack out of sheer agony and frustration from having to work with software and hardware that stumbled from one inane problem to the next. Sadly, I wish that were hyperbole. I can honestly say that Microsoft almost killed me (it may yet). For me it has reached the point that an interrogation at Guantanamo Bay Naval Station would be a vacation. This too is not hyperbole (although I wouldn’t actually look forward to it). We get all these tools to become more productive, but all they do is mire us in a swamp of incompatibilities, patches that need to be installed, tediously having to find out where some feature’s controls are hiding, on and on and on and on.
My personal sadville aside, I wholeheartedly concur with the article. The people in the field care deeply about their product and want bugs that have been in the software, sometimes even for years, to be resolved. But I see the same layers of management who want ever-greater efficiencies and statistics. Yes, statistics. We have to have metrics. That the customer is frustrated by stupid problems they should not have to deal with is immaterial. We have to have metrics, though. It has to be poured into numerical data. Glee.
I don’t think I want to stay in this field much longer. Although it has fascinating applications and I have met some of the smartest people I know at the place I work, the incessant nagging about costs is beyond annoyance (by which I do not mean to say that looking at costs is not important; I only wonder why, if costs are so important, the entire marketing department gets to go skiing in Europe on the company’s dime. It’s probably me not understanding how financing works.)
Test work is sorely underestimated as a profession, and it is looked down on. “Everybody can do it” is the refrain of those who think the only hard job is theirs.
Seriously, I think I have a good plan here. Do something that I enjoy doing and reduce my private consumption of software to a few pieces of good software. There are quality products out there, and I admire the people who produce them. The rest of the world may continue to suffer, trying to manage their bug-ridden junk with a menagerie of service packs, patches, upgrades, and a whole lot of swearing.
Enjoying my life is so much more important than clicking, yet again, on that stupid User Account Control button. And Windows 7 is going to hit the market soon. Bah. If the software vendors don’t care about their product (beyond quarterly revenue), why should I?
To all software testers out there, from someone who’s been in the trenches himself: all my respect and appreciation.
[James’ Reply: You touched on something I want to write about: minimalist computing. Restricting myself to a few apps. There’s a lot to be said for that.]
Mark Hughes says
Your experience on the Mac is two decades out of date.
[James’ Reply: Actually it isn’t. I own a MacAir, I owned a couple of iMacs. My experience working at Apple is 20 years old, but out of date? Probably not by much. I’ve been continuously in the industry. The industry hasn’t changed a whole lot.]
Mac System 7.0 was almost 20 years ago, under John “Pepsi” Sculley, and is not directly related to the current Mac OS X, which is NeXTstep, created and managed by Steve Jobs from 1985 to present. Steve Jobs is not noted for being tolerant of mistakes or sloppiness, and it shows in Mac OS X.
Comparing Mac OS X to your litany of Windows desktop problems:
* Software installation is generally insanely easy: download a dmg or zip, extract it, drop it in Applications. A few apps have system installers. A few bad apples like Adobe copy the shitty Windows experience over to the Mac, and get excoriated for it.
* Uninstallation is just as easy: drop the app in Trash. If you’re OCD, remove its support files from ~/Library/Application Support.
* You don’t need to, nor SHOULD you, remove any of the system daemons, but there are man pages for all of them.
* Startup is minutes faster than Windows, and gets faster with new OS releases.
* There are no “mystery glitches” on the Mac. Activity Monitor will show you what’s eating CPU or memory, but nothing weird or unexpected. The only time OS X has ever “snow crashed” or such on me is when I was writing bad OpenGL code.
[James’ Reply: That’s not my experience with the Mac, bro. I have to go to the “force quit” routine about once a day. Then again, I’m not using very many apps on the Mac. I don’t do as many things with the Mac as I do with the PC. So perhaps I’m just stuck on bad apps. Maybe Netflix is the problem?]
* First time you run a downloaded, unsigned app, you have to authorize it, and can view the location it came from. Otherwise, you’re never bothered.
* EULAs are your own fault. If you’re dumb enough to agree to a contract you didn’t read, you deserve the ass-raping you’ll get. No OS is going to protect you from being an idiot. What do you expect, that it’ll read it to you and quiz you on the contents? Learn some personal responsibility.
[James’ Reply: Are you familiar with the UCITA saga and the controversy surrounding click-wrap licenses, or are you just being abusive to me because your mom won’t let you say “ass-raping” in the house?]
* Drivers don’t conflict. Apple ships integrated hardware and software, and you use system-approved ways to talk to external devices. Putting shitty hardware and software from 3rd parties in a cheapest-possible-parts system with no baseline “safe” spec is an obvious recipe for disaster. I used to be amazed that Windows worked on shit Dell machines at all, but now I see that it doesn’t.
* When you buy a new Mac, connect it to the old one with a Firewire cable, and OS X copies the old files onto the new system. I was up and running with my new MacBook Pro from my old one in < 2 hours, with no interaction.
* A consistent UI and HIG (even as disparate from that baseline as some apps are now getting) makes using new software trivial.
* You seem to be complaining about businesses, and how hard life is, not your desktop. An OS can’t help you there.
[James’ Reply: I’m talking about computing. It’s something I live and breathe every day for hours a day. Apparently you do, too, but miraculously don’t run into problems.
Perhaps that’s because you are a technologist, and not a normal person? I’m really not surprised that some techno-fanboys think that the state of computing today is golly-gosh keen. It’s the boiling frog thing.]
In software development, the last 10 years have seen the rise of Test-Driven Development, JUnit, and Extreme Programming, which treat producing zero-defect software as the STANDARD, not some unattainable fantasy. The languages in common use on good platforms have gone from primitive C and the abominable C++, to stable, managed environments like Java, Python, Objective-C.
You’re not even informed enough to be wrong, it’s like you’re looking at a septic tank and proclaiming art dead.
[James’ Reply: No, you don’t produce “zero-defect” software. That is a fantasy. Work with a tester, once in a while. He’ll keep you grounded.
I’ve seen a lot of things, man. I’ve been all over the world, at a hundred or two companies, talking about testing and seeing how software is made. I’m aware of the Agile hype. The reality isn’t quite as you portray it.]
Lally says
Yes, most software does suck.
But, do not despair. You actually use only a small fraction of what’s out there, and you can choose most of that.
A few points:
1. Software quality is a fairly new concern. Software only (relatively) recently started getting so big that it couldn’t fit into people’s heads anymore. There are movements now to take control of it. Eventually the same sort of stuff we take for granted (encapsulation, abstraction, etc.) will also include testability, fault tolerance, and actual quality.
2. You may have to live a little like a monk, but you can surround yourself with good software. It’s kinda zen, really, and makes life a lot easier. It’s what I’ve done and I’m 10,000% happier for it (I measured :->).
– First rule: buy hardware from the vendor that makes your OS. It’s the toughest rule, but not terrible. Mostly this leaves Sun and Apple. I’ve got one of each (laptop & desktop, guess which one’s which :->) and that “one number to call” security is actually pretty good. OpenSolaris 2008.11 is pretty good. Dunno what’s causing your force quits on the Mac so often; I get about 1 every 4 months. I suspect it is Netflix streaming.
– Second rule: For everything else, you have virtualization.
– Third rule: trust old stuff first. Tar for backup; perl, python, or bourne for scripting; text files for most data. Emacs, bbedit, or vim for editing. Only when you get a real advantage (e.g. zfs snapshots) do you get fancy. (See the sketch after this comment.)
– Fourth rule: vendors matter. Some vendors just do terrible jobs (was that Netflix streaming, with the Silverlight plugin?), some do mediocre jobs (Apple), and some are pretty good (GNU, Sun). Nobody’s great; even NASA crashes a lander now and then. But the higher the percentage of your apps that come from the better vendors, the more reliable your systems will be.
3. Remember to apply your expectations for other systems’ quality to your own code. It’s an eye opener! Also, it helps you learn to build reliable configurations of unreliable parts. Which, so far, has been the most reliable setup I’ve seen.
Cheers brother,
-Lally
[James’ Reply: Some seriously good ideas here. It steals my thunder a little bit, since I was going to advocate some of this in my next post. Thanks, man.]
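Lally’s third rule is easy to put into practice. Here is a minimal sketch of the “old tools first” backup idea, using Python’s standard tarfile module as a stand-in for plain tar; the paths are hypothetical:

    # A minimal sketch of the "trust old stuff first" backup rule.
    # Paths are hypothetical; tarfile ships with Python.
    import tarfile
    from pathlib import Path

    home = Path.home()
    archive = home / "documents-backup.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(home / "Documents", arcname="Documents")
    print("backup written to", archive)

The design point is Lally’s: a plain tar archive of text files can be read by tools that have been stable for decades, so the backup does not depend on any one vendor’s software surviving.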
Jonathan Starr says
You get some quality from testing after development takes place. But a much better way is through TDD (which I suggest you look into).
This is the ONLY way to have true unit tests around all code and to implement the leanest possible code for a particular solution.
Remember:
(White Box Testing) > (Black Box Testing)
(The cost of a bug that is passed to a tester) > (The cost of a bug fixed before going to a tester).
[James’ Reply: Yes, I’ve read about TDD. I’ve done TDD. TDD is interesting. The only people who say TDD is better than testing are people who do not study testing. It’s nice that programmers want to produce better code. TDD helps with that. But white box testing is not necessarily “greater than” black box testing. It may be less. I try to avoid those misleading terms, actually.
Please don’t use the phrase “the only way.” That’s a marketing mentality, not an engineering mentality. If you see me use that phrase, call me on it. Thanks.]
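For readers following this exchange, here is a minimal TDD-style sketch using Python’s unittest; the function and its specification are hypothetical. The test is (notionally) written first, then just enough code to make it pass:

    # A minimal TDD-style sketch. The test is written first; the function
    # is just enough code to make it pass. Note it checks only the
    # scenarios its author anticipated: punctuation, Unicode, and empty
    # titles go untested.
    import unittest

    def slugify(title):
        return "-".join(title.lower().split())

    class SlugifyTest(unittest.TestCase):
        def test_spaces_become_hyphens(self):
            self.assertEqual(slugify("Quality Is Dead"), "quality-is-dead")

    if __name__ == "__main__":
        unittest.main()

The sketch also illustrates the limit Antony Gorman raises below: the test passes, yet says nothing about inputs its author never imagined.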
Antony Gorman says
Couldn’t agree more James.
And whilst I agree that the rise and rise of TDD is an enormously good thing for development, I also think it’s very far-fetched to state that it produces zero-defect software.
All it can ever do is confirm that the software works in the limited number of scenarios that the developer thought of up front before writing the code to pass the tests (the two things are so very far apart that I’m amazed someone talks about zero defects without the slightest hint of irony).
[James’ Reply: You know, I used to be a zero-defects advocate. My excuse is that I was 21 years old. At that time I was also an Objectivist. I’ve learned a lot about complexity and human behavior since then.
Anyone who implies that the answers to our problems are simple strikes me as a child, or as an adult intent on treating people like children.]
David says
@Nicolas – “Now when was the last time your computer froze?”
Last night. It’s a near-daily occurrence; that hasn’t changed. Removing the mechanical “hard reset” button from the front of the computer case was the dumbest move of the last ten years. These days I have to reach behind and flip the power supply switch.
You say you don’t get the BSODs and systems dying. I’m really glad for you, but I haven’t noticed much change. I still get them. Systems just refusing to respond are NORMAL, and this has been the case, without change, since the IBM PC was invented. I find that I still need the electronic equivalent of “hit it over the head with a hammer” on a regular, near-daily basis.
Ivor says
Well here we go, another degeneration into an irrelevancy about which is better, this or that, old versus new etc. etc. etc.
The fundamental issue is bigger than testing, technology, individual experience, or even one person. The quality of the products and services we get today is a function of a wider malaise within the world we live in. The minutiae surrounding the individual merits of this versus that are irrelevant when the encompassing view is one of poor, pi#$ poor, awful crap that we accept on a regular basis.
I have worked in the business of software for nearly 20 years. In that time I have had the kind of experiences that would make unbelievable reading to those outside of our industry. I have done some things that have left a sour taste in my mouth and in other cases taken a principled stand. Without a doubt, the principle of Quality is the one most abused in whatever context you can place yourself. Quality within our industry is subjective, a measure that we take and define whatever way we want it to be, so that we can do what has to be done. I have only recently in my career taken the point of view that the only measure of Quality is what the customer will accept. I work to inform them, support their decision making process and help them understand the limitations that we and they labour under.
At the end of the day, the customer will make their choice based on the minutiae people are talking about here. That and cost.
Quality is not dead, we can rebuild it, we have the will 🙂
@Mark Hughes, if you truly believe that there is some fast approaching Utopian horizon that yields the promise of Zero Defect software development, you are either a relative newcomer to this industry, a supremely arrogant developer or a fantasist. How many years has that mantra been taken out, dusted off and prostituted on the conference circuit? Answers on a postcard please! 🙂
[James’ Reply: I sure appreciate your passion, Ivor.]
Michael Bolton says
It is hard to fathom what has gone wrong with young CS majors. Why do they write such awful code? Is it because they started with C++? Or is it just some collective character flaw in their generation?
They were taught to code by teachers of our generation, and they grew up with the products that our generation developed. Entropy accelerates.
Tim Coulter says
@David,
Ah, yes, you’re correct: I presented (and understood) Melkjug as the wrong piece of technology. That said, I appreciate your feedback and sent it on to the developers.
PS: I went through the same process you did, remembered your post and was floored at the amount of discussion made in my absence. So ya, I feel your pain.
William says
Wow, what a load of BS!!! Quality is dead? Really? You rant about software being of poor quality, and I’m sure I could add a few bugs to your list; why does the little mute light/button on my HP not stay in sync with the little speaker icon in my task tray? It’s probably because they didn’t test Vista against a laptop that was released AFTER they released Vista. Is this a quality issue? Should HP or Microsoft have hired some testers from the future to test this combination?
Most software bugs fall into three categories: software/hardware or software/software incompatibility, user error, or stupid programmer error. Testing all the permutations in the first category would take years for any new piece of software, and STILL wouldn’t prove that your software would never break. The second category, user error, includes everything from not reading the prerequisites of the software to doing something with the software that it was not designed to do. Most BUGS are from this category; software is getting better and better at not allowing users to shoot themselves in the foot (when was the last time you heard of someone accidentally deleting their Windows directory?), but you can’t always protect stupid. It’s the third category that always gets the blame, and in many cases it may be right to blame them, but… Was the Y2K issue a software bug?
Outsourcing is here to stay, so get over it. Companies outsource because management perceives that their developers are going to write crappy code anyway, so they might as well outsource and get crappy code for cheaper. We as developers are responsible for this perception; we have historically not done a good enough job understanding what the customer wants, instead giving them what we would want if we were the customer. We are the worst customers on the planet.
By the time we get to 2.0 of our software, it is different enough from the original that we wind up trying to pound the square peg into the round hole. AM tries to solve this problem; stay tuned to see if it is successful. Maybe someday software built using one of the AMs will have a little sticker on the outside saying “Built using eXtreme Programming”.
Software quality is getting better as newer methods of development reduce the number of bugs that make it to QA. The downside of the “developers will test their own code” movement is that testing departments are getting slashed or off-shored. On this point you are right. But how many pieces of software do you have on your system that are pre-1.0 or beta? Beta = you are the QA department.
We as software users must demand that companies push the quality slider to the right.
[James’ Reply: This comment is kind of all over the place. But I’m sure of one thing: you’ve offered no data. You say software is getting better. I would say that is not generally the case. I would also say quality is out of the hands of any one vendor, because of the flawed tools and frameworks they are using, the OSes, and the heterogeneous computing environment.
It bothers me that I also have little hard data beyond my own daily experience, except the InfoWorld test that I have cited. I wish I had access to corporate bug databases. I believe I would find a vast increase in the incidence of intermittent bugs, or bugs otherwise deemed unfixable.]
Peter Bierman says
In 1991, with System 7, I would not say that Apple’s software was qualitatively better. In 2009, with Mac OS X, I say that it is. Partially, this is because Apple has maintained a reasonable QA process while competitors have neutered theirs, and partially because the Unix model of independent processes with protected memory has made automated crash logging much easier than it was in Classic Mac OS, making QA more effective.
I am not claiming Apple’s software has no bugs. I am claiming that the majority of “mysterious” crashes and problems experienced on Mac OS X are caused by defective RAM, and users are so accustomed to crappy software that they just assume the problem is unfixable. Many, many Mac users have a trouble-free experience. If your Mac experience isn’t trouble-free, don’t just bitch; get the problem fixed. Specifically, if you upgraded your own RAM, pull out that module, run with just the stock RAM for a few days, and see if your problem vanishes.
BTW- I agree with your core thesis here. But the solution will not come from the vendors; it will come from users voting with their wallets. That’s one of the reasons I wish Apple would take every source of instability seriously: customers cannot distinguish a product with a 1% failure rate from one with a 0.001% rate. If it fails for them, or for someone whose opinion they trust, it might as well be 100%. To differentiate by quality in the marketplace, it’s not good enough to be better; the product has to be so close to perfect that people assume any defect will be remedied.
Incidentally, that’s the model customers expect for the iPhone. If they have apps randomly hanging on their iPhone, they walk into an Apple store and get it replaced. That’s a powerful financial incentive for Apple to maintain quality. And it’s worth noting that the iPhone is running the same Mac OS X as the desktop. The tools Apple uses to maintain software quality on the iPhone were developed first for the desktop.
[James’ Reply: I have had some trouble with my iTouch and iTunes, though overall I am happy with its reliability. I am not happy with the draconian digital rights management which caused it to dump several hundred of my legally owned songs when I dared to sync it with a new computer.
iPhone had some important problems, though, when it was released. Don’t you remember? It was all over the news.]
Chris Sterling says
Good day James,
I agree that many companies and teams have given up on striving for quality. The problem I have is with the stated reason why: that it takes more (in $$ and time) to deliver a quality product. I don’t agree with that premise. A team of experienced developers with an understanding of, and a lean toward, making quality software will do so in less time than any team of cheap developers or any outsourced team with longer feedback cycles with the customer. Agile teams can get better time to market, with integrity built in to a sufficient level.
Let’s teach more people to create quality software, and more of it will be demonstrated in the marketplace. Stories will find their way out about the shortened time to market for these products and their successful subsequent releases. This will NOT solve the software industry’s issue around quality, but it will create more quality in our profession, and maybe little by little it will continue to grow. But it is a long road, and there will always be less experienced and cheap labor around that will jump to create crappy products, and we should be able to acknowledge their results, successful or not.
[James’ Reply: Chris, I think you may have missed my point. Let me say it more bluntly: quality is impossible.
Now that’s a simplification. Quality actually IS possible, in some or other context. We can play games with defining quality loosely or restricting use of products. But what I’m trying to say is that it doesn’t matter how good your developers are, if you release a product “into the wild” it will be glitchy and troublesome NO MATTER WHAT YOU DO because of THINGS BEYOND ANYONE’S CONTROL OR KNOWLEDGE.
The entropy of natural operating environments, with many accidentally interacting components, dictates that this is so. And the result is that programmers shrug and users grin and bear the pain.]
Troy Taillefer says
Well, I think you’re looking at quality too one-dimensionally. Take, for example, all those processes that start up on Windows. Yes, they consume a ton of resources, but imagine your technophobe needs one of them that isn’t started yet; how is he going to figure out how to start it? Windows starts everything because it figures that most users won’t be able to figure out how to start a process if they need it. There are other operating systems, aimed at more tech-savvy people, that make different trade-offs. Also, quality is just one pillar; cost, schedule, and scope are the others, and at least in the short term and in the small they are tradeoffs. In the large and long term, quality will impact all the other pillars.
[James’ Reply: This doesn’t affect my hypothesis that quality is dead.]
Mac versus Windows is a very interesting debate; here is my take. An OS’s purpose is to run software. Windows runs more software than any other OS, so in my book it is the best OS, because it does its job better than any other.
[James’ Reply: It’s the best because it’s the best because it’s the most?]
There are software projects that have failed striving too hard for quality, and there are projects that have imploded due to lack of quality. Good-enough software is a hard middle ground to find.
Also, if we were satisfied with simpler software, we would have a lot more quality software, but we want flashy, easy-to-use GUIs, software that understands all the complicated tax laws out there, etc. If you look at the best software, you will find it to be simple, doing only one thing well. It is easy to get one thing right; now compare that to software with hundreds or thousands of requirements/features. Trying to be all things for all people just never works that well, and probably never will.
[James’ Reply: I agree. That’s a big reason why quality died.]
As a person who writes software for a living, the quality mantra gets old. People act as if we write bad software on purpose, or as if other fields do such a better job than we do when it comes to quality (this is total bull). Family, friends, and I have been ripped off enough by poor tradesmen and contractors to know better. Plenty of cars have been recalled, bridges have collapsed, planes have fallen out of the sky, space shuttles have blown up, toys have shipped with lead paint; every industry has quality issues and has had fiascoes.
Also, since you’re a tester, I would expect you to have a rather bleak view of software quality. Try writing some software and you will notice that it is harder than it looks. Actually, try any profession other than your own and you will notice that, no matter how simple it looked, it is harder than you thought it would be.
[James’ Reply: Yeah, I was a developer before I became a tester. I know it’s hard. Did you think I was calling it easy? My hypothesis is that it is TOO hard to write software, these days.]
Now, I will agree that testing is extremely important. I agree with Fred Brooks that testing should consume 50% of the development cost and schedule in some form or another, versus coding, which should consume only 25% of the effort. If you’re following a more archaic process, that means you should have 2 testers for every 1 developer; I have never seen that at the places I’ve worked. In an agile setting it might come in the form of writing as much or more test automation code than production code.
NASA has 4 testers for every 1 developer, but most companies can’t or won’t spend the money to achieve that kind of quality. So if you’re unhappy with software quality, don’t buy the software; obviously, despite your complaints, you feel better off with this crap software than without it, or you would not buy it. Or you can write your own custom software and make it as high quality as you are able to.
Anyways interesting post.
Troy
[James’ Reply: Thank you. My post was interesting. I hope you read it, sometime. From what you wrote, it seems you only read the title.]
Michael M. Butler says
Emblematic anecdote:
Today I awoke late (better rested than usual, but awake).
I use a little alarm app to play a 15-second mp3 of a gong, repeatedly. No such gong happened. Well, maybe I turned the sound down before I went to bed. Go and look.
My XP system has BSOD’d (which is quite surprising to me). I reboot. Eventually, I find that XP wants to file a crash report, which I OK. Then I get a balloon help popup. It seems that my system has just accepted a mandatory update from Microsoft.
BSOD on completion of system software patch? Hmmmmmmmmaybeeee….
Wow. I lack the time or inclination to figure it out. I’m left with the faint free-floating hope that a BSOD won’t happen again soon.
But the BSOD did include the helpful suggestion that I might want to remove any software *I* had recently installed. Not one clue that “it” (the computer-Microsoft collective) had done so.
Marvelous.
And incidentally, on reboot, my computer lately sets its sound level to zero. Not sure how long that has been going on, but I haven’t figured it out either. How would I? So even if I had gotten a full reboot, I wouldn’t have heard the alarm go off.
Froth at mouth, rinse, repeat.
It’s this death by a thousand cuts that saps one’s strength.
Troy Taillefer says
[James’ Reply: Thank you. My post was interesting. I hope you read it, sometime. From what you wrote, it seems you only read the title.]
Sorry for failing to contribute anything meaningful and wasting your time.
[James’ Reply: Okay, but I was serious. Read what I said was my hypothesis. Then attack that if you want.]
JerseyGuy says
Troy’s response was right on.
‘Quality’, in so much as it can be measured, costs. If software companies were to apply the same level of ‘quality’ to their software as is applied to, say, the software on the Mars rovers, it’d be prohibitively expensive for consumers. There’s a cost/benefit trade-off, and customers are an equal part of that equation. To the average customer, a few blue screens every once in a while is a good trade-off for paying $200 instead of $20,000 for an OS.
David says
I utterly disagree, JerseyGuy. There is no ‘cost/benefit trade-off’. This is a completely dysfunctional market, and even though I’m fully willing and happy to pay more for quality, it isn’t available to me at any price. Customers are in NO sense any part of the equation.
If a developer did create a reasonably solid operating system, he couldn’t reasonably anticipate selling it in enough quantity to be profitable. To the extent that it wasn’t compatible with the Windows monopoly, the market would reject it just as it has rejected Macintosh and *nix. To the extent that it was compatible, MS would sue it out of existence or buy it and bury it.
Even then, there would be no applications for it. Mac users should be very familiar with this one. Why should I write an application for MacOS when I can write it for Windows and sell ten times the number of copies?
Do not pretend that it’s the market’s fault when the market has never been allowed to function. There are those who WOULD pay $20,000 for a bulletproof OS, if they had that option. There are many who would pay a lot more for something that worked better than this. It’s not available. There are no price points and no options on this graph.
Robert P says
I think one place that’s really starting to get it is, unlikeliest of all, Microsoft. Maybe not all groups, no, but their core OS folks get it. On the Engineering Windows 7 blog, they talk about their teams: usually 1 program manager per five developers and five testers. It shows in the core of Windows 7 in their beta: it’s a rock-solid product. The outlying programs (Windows Media Center, some of the other add-in programs) don’t seem to have this level of service. Windows has been one of the biggest eyesores in the PC market for years, but it has almost exclusively made itself better over the years. (Regarding apps and drivers written for Windows, that’s a whole different matter… 😉 )
There are some industries out there where “it has to work right” and there’s no way around it. Mission-critical embedded systems are a good example… well, most of the time. If the flight system in an F-22 goes out, say, because it crossed a particular time zone at a particular time, no one will put up with that, ever. Inquiries, jail terms, and in the worst case deaths follow when people’s lives are on the line and the quality isn’t there.
Here’s the kicker that I’m sure you understand but very few realize: quality in a product doesn’t increase its long-term cost, it /reduces/ it. The less money you have to pay for support, the less rewriting you have to do, the fewer dependencies you rely on, the less time you spend tracking down issues from year-old bugs, the simpler and the stronger your codebase is, the cheaper it gets. Problem is, it takes brains, guts, and time in the short term to move in that direction. Unfortunately, the easy way out is always present.
Troy Taillefer says
I thought what I had to say was relevant, but ultimately I am not the judge of what you feel is relevant.
As long as there are people that care about quality, quality can’t and won’t die completely.
Quality is doing the best job you can in the situation you’re in. I care about the code I write, so it is of higher quality than if I did not care; I actively study my practices and the practices of others to improve myself.
To me, quality is an attitude. If everyone said, “Well, I can’t make this perfect, so I am not going to do it at all,” there would be no value at all created in the world for anyone. The fact that software is valuable despite all the bugs and the cost of producing it is self-evident.
In general, my belief is that it is at least twice as hard to verify a product as to build it. I have never worked in an environment that recognized this in a put-your-money-where-your-mouth-is fashion. I have joined the test-infected crowd and write unit tests for most of my code, and I have come to appreciate the difficulty of writing good tests; it is a skill set I need to improve. Hence why I know who you are: I am considering a career move into testing. Nice talk on becoming a software testing expert, by the way; loved it.
I never bitch about the quality of products that I have the choice to avoid; I think this is pointless. I believe in caveat emptor: buyer beware. I can rant about some of the horrible pieces of software that I am forced to use at work, and how they turn something I love to do (writing code) into an utter nightmare sometimes. In defense of these horrible tools (like Rhapsody and ClearCase): someone somewhere made a free-market decision to buy them; they perceive some value in them, though their perception remains a mystery to me and my coworkers. Value and quality are in the eye of the beholder, and the fact that people pay lots of money for something is the most objective measure of value and quality I can think of. Certainly, commercial success is a better indicator of quality than some quality assurance person giving their blessing.
I was hoping to contribute to this blog because I felt I had a lot to say. If I have gone off on an unwelcome tangent I apologize again.
Troy
Sandeep Maher says
I cannot help but draw a parallel between what you mention about falling levels of quality (or rather its death, in your view) and the life we live today, with its many ‘unsavoury’ accompaniments, versus, say, 2-3 decades back, when our forefathers were the flag bearers of a relatively less complex life veering towards simplicity and order.
Do we not sense and realise that our lives today are more complicated, stinkier by the day, falling in value and substance, cursed and step-by-step rotting away from the (uncontaminated) life that we knew existed back in the 60s and before? Are we not now witness to scenes of crime/violence/hatred, each bloodier and more horrific than the previous one? Vietnam, 9/11, the Mumbai terror attack, the Winnenden school shooting, and so on…
How do we counter this in our own individual, microcosmic way? Do we give up on teaching good values to our children? Do we not stop the sibling fights and abhor violence of any kind? In a bus, when I see an elderly woman struggling to keep her balance, do I not offer her a seat? When I hear a child speaking profanities, do I not admonish her even if she is not a blood relation?
Of course I do.
Quality of life is in our hands to a great extent and we must continue to encourage, inspire, instil and ensure that we influence its goodness to the extent possible.
Likewise, when a product we are not happy with is readied for release, would we not raise a RED flag?
Of course we will.
We will be vehement in giving this information to the powers that be, supported with defect data, tests performed and not performed, plans for further testing, and so on… When we do this, do we not try to protect the user, who had better not use the product in its present shape? Do we do this because it is our job? Yes, but is that all? I think the voice comes from within, and it is equivalent to the scenario of the small boy uttering profanities.
We do our best and we should.
So I do not agree with your “Quality is Dead” completely. As much as the Project/Release Managers may push buggy products out, there would be test professionals (like us) standing in the door, wearing red and shouting “red.” The test professionals may be outnumbered and shooed away, but the bugle would have been sounded. Sanity would prevail at times and not at others, but death? No. We would not let it happen!!!
Rich Sirokman says
I stumbled upon this when I was searching for more info on why the quality of Vista was so bad, which some have attributed to the fact that Microsoft has taken the view that their software quality depends on having it pass automated test scripts. It’s a lot easier for ABC manager to prove that a program has passed XYZ tests than to explain that the software has some sort of subjective quality problem. So, software projects are driving more towards the types of things that can be measured. IMHO quality may be subjective, but purchases are definitely binary. Showing that there is a connection between these things is how to convince companies to produce higher-quality software.
I wanted to comment on the products you mentioned. I haven’t used TiVo, but I am a Dish subscriber and use the VIP 722. I think it’s an awesome machine; however, I keep noticing bugs that confuse me and cause me to sometimes miss shows. The more of a power user I become, the more bugs I see. I was pretty upset last week when the end of a basketball game I was recording got cut off.
I’ve been a Mac user for the past 3 years, and I have little interest in using Windows unless some program requires it. Apple has had some real issues with hardware, although their software works pretty well. I think this idea that the hardware/software combo has been so thoroughly tested has a lot of merit. I did own an iPod for a while, until it crashed. Now I have a Sony Walkman that has a worse interface but far superior hardware. I happen to be one of the few people willing to put up with some clunkiness to get better sound quality. I realize I’m making a trade-off. I do still use iTunes, because I love how easily it interfaces with podcasts.
In fact, I think our experiences with “pretty good” software like OS X have made everyone less patient with a poor user experience. So I don’t think quality is dead; I just think your company is dead if you aren’t figuring out how to make your users happy. It’s a complicated problem, and I hope product developers take the human aspect seriously.
David says
I got to wondering, James, why you don’t simply repudiate Vista? Just uninstall it and reinstall XP, get back in control of your system?
[James’ Reply: 1) My computer came with Vista on it, so I don’t know if the drivers in XP will work with the chipset. 2) I don’t have a copy of XP that I can install, since Microsoft stopped selling it. 3) It’s a netbook with no built-in CD drive and I don’t know if the external CD I bought for it will work during installation of an operating system. 4) It’s an annoying use of my time.]
Elaine Deion says
Wow! And I thought it was just me. Nothing annoys me more than paying big bucks for software that is buggy or just doesn’t work the way it should. Complain to the company, and all you will get is a comment that they “think” it must be that your computer doesn’t meet the operating requirements of the particular software package you have purchased. It’s just too bad that you have spent X dollars for nothing.
Simon says
Hi James,
It is interesting that you mention the financial industry, but not in the context of financial software development/testing. Well, as someone who has worked in this industry for more than 10 years, I can tell you that management is very much aware of the consequences of software defects for these companies. Your hypothesis clearly does not apply here, as the penalties for low-quality software are pretty steep. Yet in the current economic conditions, financial companies behave very much like those who (as you believe) just gave up on quality. They slash costs by replacing trained, experienced, smart testers with junior offshore staff. I do believe that they behave rationally, but the explanation is different (and simpler) than yours. In hard times the companies just slow down their development efforts, and they simply don’t need the same level of testing staff in their IT departments. In fact, they also cut application development staff, albeit not as drastically. My explanation at least gives a glimmer of hope to us testers, as the economy will recover sooner or later, and we just need to be properly prepared for that time. We need to sharpen our tools and methodologies, and learn to better articulate the value added by software testing. As new application development projects pick up speed, we should be able to deliver competitive advantage to the companies that care about quality and, by the way, hire better software testing staff. If we cannot deliver, then I am afraid software testing will end as a profession.
Bruce says
Just curious, James: why did you migrate from XP to Vista? I suppose there must be legit reasons to do so, but I can’t seem to think of any myself. (Perhaps I’m living in a bubble…) But I am curious. I have found XP to be quite manageable, as apparently you did as well? (Well, quite manageable as long as I apply my standard methods, which I would describe as ‘utterly ruthless’.) I have worked with Vista when absolutely required. It’s brain-dead. (I have so far discovered several innovative features in Vista; well, exactly three… 😉 )
I would like to present the notion that even spectacular software would seem like crap when running on Vista.
Clearly the same thing applies to the effect of all the worthless, meaningless processes running on most machines today.
The installation of update programs, app preload programs, endless unwanted and unnecessary system hacks (that usually provide crappy versions of components Windows already provides), and on and on, ad nauseam… creating this situation where you MUST use Autoruns and ProcExp just to use your machine! (As you mentioned, in so many words…)
The point being that no amount of quality can overcome an OS that is choking on crap, and/or an OS that is choking on its own crap.
Thanks…
[James’ Reply: I got an HP Netbook thing that only comes with Vista installed. So I was stuck. The next system I got was XP, but in Swedish, and it turns out it’s not trivial to change the language to English!]
J. Michael Hammond says
I agree: Quality is dying. Or dead already.
I remember a Technology Review article on the topic, maybe five years ago?, that came to the same conclusion for very similar reasons: Given the classic three-way tradeoff (quality, price, time), customers demand poor quality NOW and look for competition on price.
I’ve lived through this problem for fifteen years now. I’ve tried to find companies to work at where quality is forced to matter because the customers demand high quality and look for competition on price and time. It’s been a long and generally unrewarding road. It’s probably safe to say that I’ve turned from a highly productive, ambitious, helpful tester to an angry, frustrated, morally corrupt smartass. I used to give a damn and I miss that. I’m far enough down this path that it’s going to take a big kick to either go another path or figure out how to make the path I’ve chosen bother me less.
Please hurry up and get on with your ideas on the “what to do about it” part! 🙂
–JMike
[James’ Reply: Ah yes. I need to write the positive part!]
Debbie says
I laughed OUT LOUD!! Thank you for saying what I too have felt. I MOURN my TiVo! Why couldn’t the others get it right? Why couldn’t I keep my TiVo? I had to choose, HD or TiVo, and believe me, it was a hard choice. Thanks for letting me know I wasn’t out of line for feeling shafted.
Chuck van der Linden says
I have to admit, I only made it about 2/3 of the way through the comments before I gave up.
I have to love the folks lecturing you about how you should use this or that platform, or saying you are ignorant of this or that magic-bullet practice that will solve all the problems.
Surely it’s proof that the quality of their reasoning is dead… without knowing you, your background, and your experience, they jump to the conclusion that the problem is ‘YOU’ and not the software/OS/etc. Lovely ‘blame the user’ mentality there, which I think is yet another aspect of the underlying problem. (Reminds me of the advert slamming the cable companies: ‘We need better customers, ones who don’t want so much from us’.)
I’ll disagree on two points however.
1) Security isn’t worse; it’s far better than it was. However, the landscape is far, far worse than it was 15 years ago, by orders of magnitude, so the net result is that we are more likely to experience a problem. Fifteen years ago you could put a system up ‘naked’, without a firewall, on the Internet and be just fine. If you put that same system up there now, the ‘time to ownage’ could likely be measured in minutes. (And I don’t care if it’s a Windows system, a Mac, or a Unix/Sun/*nix box; they would ALL get owned in fairly short order if they were running an OS from 15 years ago without any patches since that time.)
2) Users are stupider… well, maybe that’s not fair; let’s say ‘not technologists’, shall we? 15 years ago you had to be computer literate to use a PC of any variety; even Macs of that era required some level of savvy from the user. The average user in those days was far more skilled than the average user today. And the system did far less then; there was less to go wrong, fewer places for things to break. Now, I’m not sure whether that trend toward less literate users has accelerated faster than the improvements in usability and self-diagnostics, but one thing is for sure: when something goes wrong, the average user of today is far less equipped to deal with the problem, in terms of experience and troubleshooting skills, than the average user of 15 years ago.
We’re also running more programs and a greater variety of hardware, all of which increases the potential for adverse interactions, so the potential for problems has increased (e.g., the pairwise test set would be substantially larger, as the sketch below suggests).
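(A rough sketch with made-up component counts, not anything from the comment itself, of how that pairwise burden grows: the number of component-variant pairs a pairwise test set must cover rises roughly with the square of the machine’s variety.)

from itertools import combinations

def pairs_to_cover(parameters):
    """Count the component-variant pairs pairwise testing must cover.

    `parameters` maps each component to its number of variants.
    """
    return sum(parameters[a] * parameters[b]
               for a, b in combinations(parameters, 2))

# Hypothetical machine of 15 years ago: few components, few variants.
old = {"os": 2, "display": 2, "printer_driver": 3}
# Hypothetical modern machine: more components, more variants of each.
new = {"os": 4, "browser": 5, "antivirus": 3,
       "gpu_driver": 4, "preloaded_app": 6, "auto_updater": 3}

print(pairs_to_cover(old))  # 16 pairs
print(pairs_to_cover(new))  # 257 pairs

Good pairwise tools pack many pairs into each test case, but the floor still rises with every component and variant added.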
There are other trends in terms of landscape change that have affected us too, such as slimmer margins for computer makers, which have forced them to seek additional revenue by preloading new systems with tons of crapware.
But still, I think your fundamental point remains true: it seems a lot of companies have either given up on quality or drunk the Kool-Aid and fallen hook, line, and sinker for some bogus silver bullet (automate all testing, offshore it all, replace testers with unit tests and TDD, etc.). There are counter-trends, such as the agile notion that the entire team is responsible for quality, not just the QA guys at the end of the cycle, and I think those are largely headed in the right direction. But if upper management doesn’t give a hoot and doesn’t fund time and energy for quality, those things are doomed to fail (imho, anyway).
–Chuck
MartinG says
Almost everything in your illustration describes problems only typically experienced by windows users.
Michael M. Butler says
MartinG, I’m sure James or I could also come up with a pretty long list of our infelicitous experiences with, say, Android phones, but the list would probably be of things almost all of which are only typically experienced by Android users.
Bill Whitehouse says
I’ve been involved with systems and software for some time. After some personal accounting, it seems about 30% of my professional time has been spent trying to make something work or recovering from something that didn’t. Discussions with others turn up similar figures.
I recently spent two days tracking down the reason the partitions on two disks in my home system disappeared. It was a driver that was part of a group of drivers; the partitions reappeared after uninstalling it. But not before a lot of frantic time I couldn’t afford was spent looking for other causes.
Drivers were not suspected initially, because of weeks of prior booting and performance issues; this seemed to be a continuation of those problems, and the driver updates were themselves an attempt to address them. I was already blaming everything else: a bad power supply, bad disks, bad memory, a BIOS update done the month before in an attempt to resolve those problems, and more. It’s gotten to the point that I have little faith in anything. Anything that presents even the smallest problem, or makes a change that requires time to adapt to, is out, unless there is no choice but to keep it. And I’m trying to avoid getting into that situation.
I love my toys. I enjoy the kind of work this world enables. New toys are great, and I’m the person others ask about them. But I have been walking away from many of them without a second thought.
For a while I was in the mode of doing research first. That is just as much a waste of time, even though it can produce good results; most of the time it’s a choice of the lesser evil. Even the old reliable companies seem to have gone over to the dark side. Reputation is no longer a solid factor.
I may be looking at the toys I’ll have for a long time, and they may never be updated. I don’t know how precarious that balance is. It’s just not worth it.
Satellite Technicians says
I did satellite installs for 10+ years (for both Dish and DirecTV), and I met plenty of people who swore by TiVo. Great systems, so I’m in agreement there.
The early Dish & DirecTV DVR systems were indeed just awful. Not so much now, though; they’re actually good these days.
faizal fazlil says
Thank you, James. I now understand completely and know how to define ‘quality’. It’s like getting out of the Matrix and seeing the true ‘wild world’. I am taking this thought into academia in my research proposal. Yep, I know it’s going to be hard.
ilan says
Your blog is a source of comfort and solace for those going through challenging times.