Microsoft is offering a first look at the next version of its Visual Studio integrated development environment (IDE) and platform: Visual Studio 2010 and the .Net Framework 4.0. There's a lot promised in the new release (expected to ship, duh, in 2010), from improved software testing tools to software engineering modeling capabilities to integrated development and database functions for application lifecycle management (ALM).
I could bulk that up with a lot more feature buzzwords, but the bottom line is that Microsoft is putting its attention on improving Visual Studio for the benefit of every one of its users—from the CIO to the software architect to the enterprise developer to the software testing team. I was given an overview of some of these new-and-improved features, particularly in Visual Studio Team System 2010 (VSTS, code-named Rosario) by Dave Mendlen, Microsoft's director of developer tools, and Cameron Skinner, VSTS product unit manager.
But it would be too easy to simply report on vendor promises and rely on my own jaded programmer-meets-journalism background. With Microsoft's permission, I shared some of the product details with experienced programmers and software quality assurance professionals, so you can learn how the software developers in your shop may respond to the new software (or at least, to its promises based on demo-ware).
A key goal in VSTS 2010, says Microsoft, is to help democratize ALM by bringing all members of a development organization into the application development lifecycle, and to remove many of the existing barriers to integration.
One way that Visual Studio 2010 will do this is by addressing each of the ALM roles in turn: the business decision maker (who needs a project overview but doesn't want to be bogged down in details); the lead developer or system architect (who enables the software infrastructure and draws the blueprint); the developer who writes the code; the database administrator (DBA) who integrates it with the company database; and the testers (who make sure the software is of high quality).
For the IT manager or CIO, says Mendlen, VSTS will give clarity and visibility into the state of the project throughout the lifecycle, using Team Foundation Server-enabled dashboards customized for her role. The dashboard can answer high-level questions such as ongoing project cost or project status.
Agile Tools, Built-In
Visual Studio 2010 also will sport features to integrate Agile methodologies into the tech stack using Team Foundation Server. Skinner explains, "We'll include in the [VSTS] box an Excel workbook for teams that are leveraging, say, the Scrum process so they can get burndown from their project." These features, he says, will let Agile teams track daily progress, see projects broken down into iterations and use sprints.
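The burndown chart Skinner mentions is conceptually simple: remaining work plotted against each day of the sprint. A minimal sketch of the calculation (illustrative only; VSTS derives these figures from Team Foundation Server work items, and the numbers below are made up):

```python
# Minimal burndown sketch: remaining story points at the end of each
# sprint day. Illustrative only -- the daily "work completed" figures
# are hard-coded here, whereas VSTS pulls them from work-item data.

def burndown(total_points, completed_per_day):
    """Return remaining work: starting total, then the balance after each day."""
    remaining = [total_points]
    for done in completed_per_day:
        remaining.append(remaining[-1] - done)
    return remaining

# A five-day sprint that starts with 40 points of planned work:
print(burndown(40, [8, 5, 10, 7, 10]))  # [40, 32, 27, 17, 10, 0]
```

Plot those values against the sprint days and the downward slope (or lack of one) is the daily progress signal Skinner describes.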
Skinner and Mendlen are obviously very proud of Microsoft's investments around modeling, pointing to the company's recent membership in the Object Management Group and its work to advance design tools with UML. They aim to take these software engineering efforts further. "Tools for architects have traditionally been designed for that population," says Mendlen, "but they could be useful across the lifecycle. One place we think that's very valuable is for developers."
So in this upcoming Visual Studio Team System, an "architecture explorer" tool will let the user interrogate the code base and graphically display its architecture, including the relationships and dependencies in the code. Microsoft hopes this feature will help developers see what their code really looks like (not just what they imagined). While this feature will obviously be valuable for iterative development and debugging purposes, it also will help a developer who inherits a project and needs to figure out its design. If it works as promised, it might help save a developer a pizza-and-Jolt-fueled weekend spent wondering, "What the heck is this F variable supposed to be doing?"
"I love the idea of the architecture map for inherited projects that are not well-documented," comments Julia Lerman, .Net Mentor and author of O'Reilly's Programming Entity Framework ."Or better yet, when I return to an old project that I wrote for my own personal use and didn't bother to document very well."
Cory Foy, a development manager from Tampa, is a little less thrilled by the promised architectural validation features. "This has been somewhat available in the Team Edition for Architects," he says, "but I've known very few teams who have bothered. The concept is wonderful sounding, especially to a CIO or development manager: Draw a diagram of your data center, of your deployment, of your application and of your architecture—and you can magically 'validate' that everything will work when deployed," he explains.
In practice, however, even when teams go through all of that effort, someone has to maintain all of those diagrams, Foy says. "They typically fall out of sync rather quickly, and just as quickly get abandoned. Until there are ways of automatically keeping those things in sync (datacenter machines report out what they have, applications report what they need based on dependencies, deployment diagrams are generated from build scripts, and architectures are created based on code), and all of this is automated as part of the build/continuous integration process, I predict this either not being well adopted, or being a hindrance."
The last release of Visual Studio had support for a continuous build process. Inspired in part by Agile methodologies, says Mendlen, "We're taking it to the next level." The build process will have a customizable workflow engine, ensuring that, say, when a developer checks in her code, it goes to the architect. The build process can be tuned for the specific problem domain, Mendlen says.
Part of that build process includes architectural validation. If the software specs say that the user interface code must not directly call the data layer, VSTS will be able to flag as an error any code that breaks that rule. New tools will help the development staff visually draw diagrams that describe the software's architecture, and map the physical assets to existing code. As part of the check-in policy, says Skinner, "We'll tell if the code is abiding by those constraints" and let the developers act on it appropriately. Just what a developer does about noncompliance is up to the development team leaders; besides, these architecture validation features are just as likely to identify design bugs as they are check-in noncompliance issues.
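The layering rule described above (UI code must not call the data layer directly) is the kind of constraint a check-in policy can verify mechanically. Microsoft hasn't published how VSTS implements the check, but a toy sketch conveys the idea; the module and layer names below are hypothetical:

```python
# Toy layering check, similar in spirit to the architectural validation
# described above. VSTS's actual mechanism is not public; the layer and
# module names here are hypothetical.
import re

# Rule: modules in the "ui" layer may not import the "datalayer" module.
FORBIDDEN = {"ui": {"datalayer"}}

def check_layering(module_sources):
    """module_sources maps module name (e.g. 'ui.orders_form') to source text.
    Returns a list of (module, forbidden_import) violations."""
    violations = []
    for name, source in module_sources.items():
        layer = name.split(".")[0]
        for match in re.finditer(r"^\s*import\s+(\w+)", source, re.MULTILINE):
            target = match.group(1)
            if target in FORBIDDEN.get(layer, set()):
                violations.append((name, target))
    return violations

sources = {
    "ui.orders_form": "import datalayer\nimport logging\n",   # violates the rule
    "business.orders": "import datalayer\n",                  # allowed
}
print(check_layering(sources))  # [('ui.orders_form', 'datalayer')]
```

A real tool would work from compiled dependency graphs rather than text scanning, but the principle is the same: the architecture diagram becomes an executable rule set rather than documentation that drifts out of date (exactly the failure mode Foy describes).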
"The more that Microsoft is [enables] developers to engage in Agile techniques, right out of the box, the more adoption you'll see," opines Lerman. "I've witnessed this in the .Net user group I lead. Once unit testing was embedded into VS2008 Professional, many members were suddenly interested in learning how to test—even though tools like nUnit have been around and highly regarded for years."
Putting Quality Earlier in the Development Lifecycle
One sometimes-stressful interaction in the application development lifecycle is the tension between developers and testers. Developers have to do a better job of testing their code before they send it off to the software testers. Developers don't always know which unit tests they have to run, and often they don't have the time or inclination (your own cynicism-meter can determine which) to run the tests anyway.
"We've made some significant strides here," says Mendlen. In Visual Studio 2010, he says, tools will help create a direct relationship between unit tests that must be run and the code a developer is writing. This "must-do" testing feature, called test prioritization, will get the developers to check their code against at least the highest-priority unit tests. (It, too, shows the impact of Agile methodologies.)
"I love the idea of having a tool help me figure out which tests I should be running since I'm not a pro at that," says Lerman.
This is a good thing for acceptance tests, but not developer tests, points out Jeff Grigg, an independent consultant in Saint Louis. "Acceptance tests are specified by the customer and should fail until the functionality is implemented. For this usage, the VS2010 improvements could be quite useful," he says.
On the other hand, Grigg continues, developer tests written for Test Driven Development (TDD) should pass shortly after being written. They should be run frequently—before each checkin, at a minimum—and so should never fail for checked-in versions of the software. Explains Grigg, "If VS2010 continues to run tests as slowly as previous VS versions, this testing tool will continue to be inadequate for effective Agile Test Driven Development usage. It's a solvable problem: ReSharper, by JetBrains, and most open source .Net 'xUnit' testing tools do run the tests much faster. (To do this, one must run all the tests in a single process.)"
Kurt Häusler, a software developer and Scrum Master at FotoFinder Systems in Germany, is hopeful about the software testing tools but sets his expectations low. "MS Test has a lot of shortfalls," he says. "I wonder if it's not too late to regain market share from NUnit, MbUnit, Typemock and Rhino Mocks, etc. (assuming they are going to provide some sort of mocking framework). But presumably they are going for the 100 percent Microsoft shops that have either not embraced unit testing yet, or are struggling with MS Test."
Another type of animosity results when a software tester finds and reports a bug, and the developer replies, "Sorry, can't replicate the problem. It works on my box! (Have a nice day.)" Testers have to spend time convincing the developer that the problem exists, and identify the criteria under which the software fails. This does not encourage them to become drinking buddies.
So, new in Visual Studio 2010 will be a video-capture feature recording what's going on as the QA professional tests the software. The video can be handed off to the development team. The developer to whom the bug is assigned can see what happened, identify the steps leading up to the crash and, said Skinner, generate "the audible gasp of the developer realizing he needs to get his environment to the right state to work on the debugging process."
This isn't only video capture. These tools (which internally the VSTS team currently calls "Tivo for debugging" but surely will have a marketing-approved name by release time) also record the state of the application as the tester is testing it. Think of this enhanced debug log as an airplane's black box, says Skinner, recording all the statuses and states and variables (a quad-core Intel system with 2GB of RAM, for instance, not just which Windows service pack), with the ability for the developer to run and play it back.
In other words, when the QA person clicks "Create new bug," the video and rich debug log become part of the record. The developer to whom the bug is assigned can click on the video to see the software's behavior. Her IDE is put into a state that looks as though she just stepped into the debugger. She's not actually running the app, said Skinner, but is "playing back the Tivo recording."
This feature generates some developer enthusiasm. "I am rather excited by the 'Tivo for debugging' even though I can see lots of places managers will want to misuse it," says Foy. "Communication in a team environment is vital, and being able to capture what was going on in the screen and play it back is a wonderful feature that I'm sure my teams would use."
"I'd certainly like to have this tool," says Grigg. "I've seen QA testers, on several projects, refrain from reporting difficult-to-reproduce bugs because, without an automated test script that demonstrates the problem, it is often very difficult for the developers to make any progress on the issue."
In fact, Grigg wants more: Ideally, the functionality should be available to end users who discover problems in the field. "In my experience, that's the source of our most challenging support issues," he says.
Overall, says Foy, "It looks like Microsoft is starting to 'get it' in some ways, and in others is making it harder for teams to get the full experience without going 'all in.'" In Foy's opinion, that's because many of the advanced features that teams really need and want to use—Agile modeling, architecture explorer, and possibly test prioritization—will only be available in the Team Edition or the full Suite. "This is typically a $4,000 to $8,000 investment for many teams on top of Visual Studio Professional," Foy points out. Plus, he says, Microsoft hasn't always done a good job of putting features in the right editions. For example, in VSTS 2005, developers had to either give up all of the other great developer tools by picking the Tester edition, or purchase the full suite, he says.