
Hardware Testing Best Practices

When it comes to testing hardware, longer is better. By James E. Gaskin

SYSTEM BUILDERS ROUTINELY TEST their machines for speed and reliability, both to collect superlatives (fastest, coolest, quietest, and so on) that can be used in advertising and to reduce the chances that the PCs they send to customers will get sent right back. However, no two builders seem to use the same tests or get identical results using similar hardware.

“We do a 200-point QA assessment,” says Josh Covington, director of sales and marketing for Velocity Micro, in Richmond, Va. “That covers all components and all specifications, and ensures the benchmarks are correct.”

Velocity Micro designs systems for individual gamers and workstation users, so Covington feels personally responsible for ensuring buyers receive the quality products and support they deserve. “We do all our own warranty work, so we want to catch any problems before we ship,” he says. “If a box needs service, we probably lost that customer.”

Mike Beuligmann, president of St. Louis-based eCollegePC.com, feels the same way. Both companies, in fact, spend quite a bit of time on testing. And both say the more time, the better.

Tools of the Testing Trade

Beuligmann has a list of favorite software tools for assessing computers. “We start with [Memtest86+] to test the memory and then run Prime95 with Real Temp to test the CPU and memory,” he says. “Then we run 3DMark or UNIGINE Heaven 3D to test the video performance and the overall system.” The complete process takes 24 to 72 hours, depending on the build, and according to Beuligmann is highly effective at verifying system quality and helping his company maintain high customer satisfaction ratings.
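For builders who want to script that kind of staged burn-in, the Python outline below is one minimal way to sequence it. The executable names, arguments, and durations are placeholders rather than the exact commands either company uses, and some tools, such as Memtest86+, boot outside the operating system entirely, so treat this as an orchestration sketch, not a drop-in script.

# Minimal sketch of a staged burn-in runner, loosely modeled on the sequence
# described above (CPU/thermal stress, then GPU). Commands and durations are
# assumptions, not the vendors' actual invocations.
import subprocess
import time

# (stage name, command, duration in seconds) -- all values are placeholders
STAGES = [
    ("cpu_memory_stress", ["prime95.exe", "-t"], 12 * 3600),    # placeholder args
    ("gpu_stress",        ["heaven_benchmark.exe"], 8 * 3600),  # placeholder args
]

def run_stage(name, cmd, duration):
    """Start a stress tool, let it run for `duration` seconds, then stop it."""
    print(f"[{time.ctime()}] starting stage: {name}")
    proc = subprocess.Popen(cmd)
    try:
        proc.wait(timeout=duration)          # tool exited early: possible failure
        print(f"  {name} exited early with code {proc.returncode}")
        return False
    except subprocess.TimeoutExpired:
        proc.terminate()                     # ran the full window without crashing
        print(f"  {name} completed its {duration / 3600:.0f}-hour window")
        return True

if __name__ == "__main__":
    results = {name: run_stage(name, cmd, dur) for name, cmd, dur in STAGES}
    print("burn-in summary:", results)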

Velocity Micro’s favored tools include Geekbench or PCMark to test the complete system, as well as CrystalDiskMark, 3DMark, UNIGINE Heaven, Intel Extreme Tuning Utility (XTU), and Cinebench for individual components. Since Covington prefers Samsung SSDs, his team relies on Samsung Magician for health and performance checks. The company also employs a custom application to run cores at 100 percent utilization overnight and takes care to ensure that the peripherals it uses, such as monitors and keyboards, all work well with its PCs.
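Velocity Micro's overnight load generator is an in-house application, so the short Python sketch below is only an illustration of the general idea: one busy-loop worker per logical core, left running for a fixed window.

# A minimal sketch of an all-cores overnight load generator (Velocity Micro's
# own utility is proprietary; this is not it). One busy-loop worker runs per
# logical core until the deadline passes.
import multiprocessing as mp
import time

def spin(stop_at: float) -> None:
    """Keep one core busy with integer math until the deadline passes."""
    x = 0
    while time.time() < stop_at:
        x = (x * 31 + 7) % 1_000_003   # arbitrary arithmetic; result is discarded

def burn(hours: float = 8.0) -> None:
    stop_at = time.time() + hours * 3600
    workers = [mp.Process(target=spin, args=(stop_at,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()                        # returns only after the full burn window

if __name__ == "__main__":
    burn(hours=8.0)                     # overnight run; adjust as needed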

The total testing time for each Velocity Micro system is one complete day. “We feel you should have at least a 24-hour burn-in period,” Covington says. “That’s how you find the defects in hardware. An hour test will find some errors, but you’ll find more in hours 15 [to] 20—if there are errors, of course.”  

Variations in Results

While some companies see variations in test results between systems, Beuligmann rarely experiences that issue. “We use only high-end components on our builds so that may be why our results are consistent,” he notes.

According to Covington, one reason test results tend to vary so widely is that vendors check the things that matter most to them. Workstations? Prove that the processors can handle the customer’s workloads without overheating and failing. Gaming boxes? Give graphics a workout and measure screen refresh rates and latency. General office machines? Test boot time, network speed, built-in Wi-Fi consistency, and other factors.

One thing Covington has discovered is that minute differences in hardware, even between identical SKUs, can make a big difference in reliability. The quality of the silicon, minor fluctuations in how much power a device draws, and even driver optimization all affect performance and therefore test results. That's one reason many system builders favor extended burn-in times.

“In addition, make sure all the driver software and the firmware are as up to date as possible. If the system is overclocked, ensure complete stability before testing,” says Covington. “And one final suggestion: Never run a test just once.”
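Covington's last point is easy to automate. The hypothetical Python sketch below repeats a benchmark command several times and flags runs whose results spread more than a few percent; the command itself and the way its score is read are placeholders, since real benchmarks report results in their own formats.

# Sketch of the "never run a test just once" advice: repeat a benchmark and
# flag suspicious spread. The command is a placeholder, and wall-clock time
# stands in for a real score.
import statistics
import subprocess
import time

BENCH_CMD = ["my_benchmark.exe"]   # placeholder command
RUNS = 5

def run_once() -> float:
    """Run the benchmark once and use elapsed wall-clock time as a stand-in score."""
    start = time.perf_counter()
    subprocess.run(BENCH_CMD, check=True)
    return time.perf_counter() - start

if __name__ == "__main__":
    scores = [run_once() for _ in range(RUNS)]
    mean, stdev = statistics.mean(scores), statistics.stdev(scores)
    print(f"runs: {scores}")
    print(f"mean {mean:.1f}s, stdev {stdev:.1f}s")
    # A large spread relative to the mean suggests an unstable system
    if stdev > 0.05 * mean:
        print("WARNING: results vary by more than 5 percent -- investigate before shipping")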

That persistence, along with patience, will ultimately pay off in the form of higher quality, happier customers, and a healthier balance sheet.

Image: Intel Extreme Tuning Utility

About the Author


JAMES E. GASKIN is a ChannelPro contributing editor and former reseller based in Dallas.
