Software developers are paid to create functionality and solve problems. With time-to-market demands squeezing them to produce working code, code quality can slip down the priority list as they race to get deliverables out the door, leaving problems to be fixed only after users encounter them.

Add to this the unique challenges presented by developing software for Microsoft’s .NET platform, and quality could become compromised even further.

“What makes .NET unique among Windows platforms is that it’s garbage collected and that it runs both on the client and on the server,” explained Chris Sells, vice president of the developer tools division at Telerik. “What this means is that apps need to be tuned differently for load times and responsiveness on the client versus overall throughput on the server. Further, apps need to be profiled for memory leaks so that they’re not swamping a user’s computer or taking down a server. These things are especially difficult for .NET programmers, because the garbage collector makes it very easy to make code that works and very difficult to write code that performs well and uses memory efficiently, because the garbage collector hides flaws from you.”
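Sells' point about the garbage collector hiding flaws is easy to see in any managed language. The sketch below is illustrative Java (the same reachability rule applies to the .NET GC); the `ListenerRegistry` class and its methods are hypothetical, not from any tool mentioned here. A collector can only reclaim objects that are unreachable, so a long-lived registry that is never pruned pins every listener in memory: the code "works," but leaks.

```java
import java.util.ArrayList;
import java.util.List;

// A "leak" in a garbage-collected language: the GC cannot reclaim objects
// that are still reachable, so a long-lived list that is never pruned keeps
// every listener alive even after callers are done with them.
public class ListenerRegistry {
    private static final List<Runnable> LISTENERS = new ArrayList<>();

    public static void register(Runnable listener) {
        LISTENERS.add(listener); // stays reachable forever unless removed
    }

    public static void unregister(Runnable listener) {
        LISTENERS.remove(listener); // the fix: pair every register with this
    }

    public static int count() {
        return LISTENERS.size();
    }
}
```

This is exactly the class of problem a memory profiler surfaces: the heap grows, yet no single allocation looks wrong in isolation.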

Many development organizations have adopted some form of agile, continuous build and delivery to ship products more quickly, yet some argue this forces organizations to put less emphasis on quality, or, to maintain the desired level of quality, to give up some of the advantages of agile development.

Karthik Ravindran, senior director of ALM at Microsoft, said that organizations need to strike a balance between agility, quality and scale. “There’s a lot of talk about agile and agility, but without quality, it doesn’t add business value,” he said. “You have to be fast with quality built in. Quality without agility doesn’t add value either.

“Today, in an effort to capitalize on the modern business opportunities enabled by software, decision-makers and practitioners are striving hard to strike the tough balance—and this is where the challenge exists—between agility and quality without compromising either.”

There are two ways to look at the issue of code quality: development best practices, and the use of tools to ensure quality. But how does an organization define quality?
Nail down your measurements
“Code quality is a squishy thing,” Sells said. “There are all kinds of ways to measure it, from things like complexity and how many functions you’ve got, and how many lines of code you’ve got in a particular function, or how well you’ve named the variables or the function names or the comments, to how well-formed, how secure it is, how robust it is. Code quality is such a huge topic.”

A good practice for developers is refactoring. “The idea is you look at a mess of code that works sort of right, but you want to refactor it to be more efficient, more readable, more maintainable,” Sells said.
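A small before-and-after sketch shows what Sells means. The example below is illustrative Java with hypothetical names (`PriceCalc`, the tax and discount constants are invented for the example): the refactored version computes the same result, but the intent is readable and each rule can change independently.

```java
public class PriceCalc {
    // Before: one dense expression full of magic numbers. It works,
    // but nothing explains what 0.9 or 1.08 mean.
    public static double totalBefore(double p, int q) {
        return p * q * (q >= 10 ? 0.9 : 1.0) * 1.08;
    }

    // After: the same logic refactored into named constants and a small
    // helper, so the tax rate and bulk discount are self-documenting.
    private static final double TAX_RATE = 0.08;
    private static final int BULK_THRESHOLD = 10;
    private static final double BULK_DISCOUNT = 0.10;

    public static double totalAfter(double unitPrice, int quantity) {
        double subtotal = unitPrice * quantity;
        double discounted = subtotal * (1.0 - discountFor(quantity));
        return discounted * (1.0 + TAX_RATE);
    }

    private static double discountFor(int quantity) {
        return quantity >= BULK_THRESHOLD ? BULK_DISCOUNT : 0.0;
    }
}
```

Both versions produce identical totals, which is the point of refactoring: behavior stays fixed while efficiency of comprehension improves.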

JustCode: This Visual Studio extension for code-quality analysis and refactoring supports a greater range of refactorings as well as a number of open-source unit test frameworks for low-level unit testing. A built-in rule engine reads your code as you write it and comments on anything that isn’t done according to a given set of best practices. “In .NET, it’s easy to make code work, but you’re not sure you’ve done it the right way,” said Sells. With JustCode, developers “get an education as they write, and it makes you a better coder as you code,” he said.

JustTrace: A standalone tool that profiles an application to detect such things as memory leaks or performance issues before they become big problems. The tool takes a snapshot of the problem area and isolates it for faster remediation.

JustMock: This low-level developer tool gives users the ability to create a fake environment in which to write code so it doesn’t run against the production environment.

SELLS RECOMMENDS: “Entire books have been written about [quality]. I recommend ‘.NET Coding Style Guidelines.’ Take a look at that book, use the tools and use that profiler. Check for memory leaks, check for performance problems. Those are my top three.” He added that .NET programmers “need to be proactive about performance and memory usage tuning for .NET apps.”
Use small methods
Mark Miller, chief architect of IDE tools at DevExpress, said one of the best things developers can do to create quality code is to use small methods. “There are a number of benefits here for higher code quality,” he said. “New developers can get up to speed faster, it takes less time for humans to understand it, and a small method serves a single purpose.”

Source code, Miller said, is a company asset, and it makes sense for the company to invest in the quality of that asset. He is a believer in test-driven development. “Test cases allow you to set expectations as to how the code is to perform. The problem is, they’re time-consuming to create. For every method you write, you’ll have multiple test cases. That’s another reason to write small methods,” he said.
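Miller's two points reinforce each other: a small, single-purpose method needs only a handful of test cases. A minimal illustration in Java (the `Slug` class is a hypothetical example, not from the article):

```java
// A small method with a single purpose: turn a title into a URL slug.
// Because it does exactly one thing, a few test cases cover it completely.
public class Slug {
    public static String of(String title) {
        return title.trim().toLowerCase().replaceAll("\\s+", "-");
    }
}
```

A large method that parses, validates, transforms and saves in one body would need a combinatorial number of test cases; splitting it keeps each test, and each method, trivially understandable.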

One of the tenets of test-driven development is “consume first,” where you spell out what you want the application’s consumers to do before you create the code. “If the application is hard to use, you’re not thinking of the consumer first,” Miller said.

He elaborated: “As an example, I might have an engine, then inside the engine it might have a method called print, and I might pass it a document… That’s an example of a simple method that would be consumed by developers. Well, it turns out that it’s easier to write code that is easier to consume by developers if you write it in what’s called a consume-first manner, where you write the call-in code first. You say, ‘Here’s what I want my consumers to do.’ And here’s why that’s important: If you don’t do that, if you don’t think about the consumer developer working with the library, then you’ll typically create a method, and there are no constraints on ease of use for that method.

“If you need to add parameters, you’ll have as many parameters as that requires. If it’s hard to use, you have no idea because you’re not thinking about it from a consume-first point of view. So consume-first code creation is also something that’s very useful when you’re creating libraries, or you’re working in a large team and you’re creating code that will be consumed by other developers. If you don’t think about it carefully from the consumption standpoint, you’re likely to create code that’s harder to use, harder to consume.”
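Miller's print example can be made concrete. In a consume-first sketch, you literally write the call site you want first, then shape the API to satisfy it. The Java below is hypothetical (the `Engine` and `Document` names echo his example; the method body is invented for illustration):

```java
// Consume-first: the desired call site is written before the API exists:
//
//     String result = engine.print(document);
//
// That one line constrains the design: a single parameter, no setup ritual,
// no extra configuration the consumer has to discover.
public class Engine {
    public String print(Document doc) {
        return "printed: " + doc.title;
    }

    public static class Document {
        public final String title;
        public Document(String title) { this.title = title; }
    }
}
```

Had the method been designed implementation-first, it might have accumulated printer handles, format flags and retry counts as parameters; starting from the consumer's one-liner keeps that pressure visible.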

Other practices for code quality Miller espoused include pair programming, keeping variable names meaningful and consistent, and consolidating duplicate code.

CodeRush: This Visual Studio add-in enables “coding at the speed of thought,” said Miller. The software highlights code structure for deeper understanding, and visualizes the things that are happening with complex code so debugging is made easier. Further, developers can use CodeRush to prototype software, and take advantage of code providers that he described as “kind of like autocomplete or Microsoft’s code snippets on steroids.” Code templates offered in CodeRush help team members adhere to an organization’s coding standards, and thus “the quality of everyone’s code is bumped up a little bit,” he said.

CodeRush also enables better refactoring by detecting duplicate code and offering solutions to consolidate that code so it can more readily be understood and maintained. The tool extracts the interface so that all refactoring takes place in the code, letting developers “think about design rather than operating the tool,” Miller said.

Finally, CodeRush provides code analysis that can locate and turn a spotlight on issues with the code “on the fly,” and makes remediation one-click simple.

MILLER RECOMMENDS: “Refactor like crazy. When you see a problem, you fix it.”
Code review a ‘cornerstone’ of quality
Sergei Sokolov, vice president of product management, testing and performance at SmartBear Software, said he believes quality begins with coding practices and code review, which he called “a cornerstone of quality.”  

He said code review and static analysis are the best tools that serve the two-sided quality issue of satisfying end users and fellow developers. “There are of course different parameters that you can apply to set up a process with them, because with both, you can quickly get to a state where the effort to sustain these practices outweighs the benefits,” he said.

“One can argue that finding one critical bug is worth a great many hours of effort spent on sustaining the process. But if you set it up productively, in that you don’t check for too many things, then you will observe a positive impact to both code maintainability and the overall code quality for the end user.

“With .NET or anything else, based on what I’ve read over the years and what I see in our development teams, really good practices haven’t changed,” Sokolov continued. “The technology has changed, the level at which the application is described sometimes changes, some of the implementations change, but the basic practices as they were established probably close to 20 years ago are still there.”

For .NET development, he said C# is a simpler language than C and C++, and has evolved to address some of the typical problems of those languages relative to memory and other issues. “Threading is a problem. Memory management, even though the simplistic forms of that have been addressed with the runtime, is still a problem, because you can paint yourself into a corner if you’re not watching what you’re doing,” he said.

“Security is always a problem, because it goes beyond a language; it’s a paradigm, and therefore, there may be practices that may be language-specific, but there may be far more practices that are implication-specific. So then you kind of go beyond the .NET paradigm and you talk about the general principles of addressing databases and building queries and validating inputs, etc., which are universal.”
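The universal practices Sokolov alludes to, validating inputs and building queries safely, look the same in any language. The Java sketch below is illustrative (the `InputGuard` class and username rule are hypothetical); the JDBC fragment is shown in comments since it needs a live database:

```java
import java.util.regex.Pattern;

// Two universal security practices: validate inputs at the boundary, and
// never build SQL by concatenating user-supplied strings.
public class InputGuard {
    // Allow-list validation: only letters, digits and underscore, max 32.
    private static final Pattern USERNAME = Pattern.compile("[A-Za-z0-9_]{1,32}");

    public static boolean isValidUsername(String s) {
        return s != null && USERNAME.matcher(s).matches();
    }

    // Parameterized query (sketch): the driver escapes the value, so an
    // injection payload is treated as data, not SQL.
    //
    //     PreparedStatement ps = conn.prepareStatement(
    //         "SELECT id FROM users WHERE name = ?");
    //     ps.setString(1, name);  // never: "... WHERE name = '" + name + "'"
}
```

Both habits predate .NET and apply beyond it, which is Sokolov's point: the paradigm is bigger than the language.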

CodeCollaborator: A peer-review tool that integrates with Visual Studio 2012 and supports Microsoft’s Team Foundation Server source control, creating a workflow that provides more traceability to the code-review process.

AQtime Profiler: A runtime performance, memory and exception profiler, and code-coverage tool that plugs directly into Visual Studio and offers a superset of Visual Studio’s built-in profiling capabilities.

SoapUI: This standalone tool offers automated testing of Web services that Sokolov said is “perhaps the most widely used test tool on the planet,” though he acknowledged it does not support Windows Communication Foundation at this point.

TestComplete: SmartBear’s functional test automation tool has supported Windows technologies throughout its life, including Visual Studio versions from 2002 through 2012, plus Silverlight and .NET. It’s a very capable record/replay/script, data-driven test framework that works with all Microsoft technologies, Sokolov said. Tests created in the tool can be checked into TFS and then viewed in Visual Studio, he explained.

SOKOLOV RECOMMENDS: “I’m a big fan of code review and a big fan of static analysis.”