
New Unit Testing Features of Visual Studio 2012


Visual Studio 2012 was launched some time ago. A lot of new features were introduced with this version. Some of the changes are big; from some points of view, the difference between the old version of Visual Studio and the new one is like the difference between Windows 7 and Windows 8.
I think a lot of developers have already played a little with this new version of Visual Studio. The first thing you notice after installing it is how little time it needs to load. The second thing is the new, Metro-like UI.
One area where Visual Studio 2012 made a big step forward is unit testing. A lot of new features were introduced in this area that not only increase the productivity of the developer, but also improve the quality of the code. In the rest of this post we will look at the main new features from the unit testing perspective.

Run tests after each build

In the old versions of Visual Studio, we didn't have a built-in mechanism to run unit tests after each build. Because of this, we had to click "Run All Tests" every time we built the solution, which could become annoying after a while. Visual Studio 2012 has this feature built in, and with only one click we can activate or deactivate it.
All the tests run in the background, without bothering us.

Debug/Run tests directly from code

How many times have you had to go to the Test Explorer window just to run your current test? From now on, Visual Studio 2012 has a shortcut in the context menu that runs the tests that are selected in the code.


Group and run unit tests based on category

From now on, we can add one or more categories to our unit tests using the TestCategory attribute. This attribute can be added not only to each individual test, but also to the unit test class. Using this feature, we can run only the unit tests from a specific category, as shown in the sketch below. This is great not only on the developer's machine, but also on the build machines.
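Here is a minimal sketch of how the attribute is used (the class and method names are made up for illustration):
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class FooTests
{
    [TestMethod]
    [TestCategory("Atomic")]
    public void Add_TwoNumbers_ReturnsSum()
    {
        Assert.AreEqual(3, 1 + 2);
    }

    [TestMethod]
    [TestCategory("Atomic")]
    [TestCategory("Integration")] // a test can belong to more than one category
    public void Save_ValidEntity_WritesToStore()
    {
        // body omitted in this sketch
    }
}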
To run only the unit tests that have a specific category from the command line, we need to specify the /TestCaseFilter option:
vstest.console.exe myFooProject.dll
   /TestCaseFilter:"TestCategory=Atomic"

Assert.ThrowsException

Until now, if we had a unit test that checked whether a method throws an expected exception, we had to decorate the test method with the ExpectedException attribute. But this is not the best solution, because the attribute applies to the whole test method, so there were cases when we could not determine precisely the source of the exception. The new Assert.ThrowsException scopes the assertion to one specific call.
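A minimal sketch of the difference (Foo and its Bar method are hypothetical):
// old, attribute-based approach: any ArgumentNullException
// thrown anywhere in the test method makes the test pass
[TestMethod]
[ExpectedException(typeof(ArgumentNullException))]
public void Bar_NullInput_Throws_OldStyle()
{
    var foo = new Foo();  // an exception thrown here would also satisfy the attribute
    foo.Bar(null);
}

// new approach: the assertion is scoped to the exact call
[TestMethod]
public void Bar_NullInput_Throws_NewStyle()
{
    var foo = new Foo();
    Assert.ThrowsException<ArgumentNullException>(() => foo.Bar(null));
}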
At the moment, this feature is available only for Windows Store applications. We hope that it will also be made available for other types of projects.

Smart unit test discovery

The algorithm used to discover new unit tests was improved. New unit tests are now discovered extremely fast. We no longer need to rebuild the solution and wait until the new unit tests are discovered by Visual Studio.

Integration with JavaScript

Using a plugin like the Chutzpah Test Adapter, we are able to run unit tests written in JavaScript and see them in Test Explorer. Not only that, but we can run tests written in C# and JavaScript at the same time; in this way we can run the entire test suite as one thing. The only things necessary are to install the Chutzpah Test Adapter and use a JavaScript unit testing framework like QUnit.
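A minimal QUnit sketch (the add function and the add.js file are made up for illustration; the reference comment is how Chutzpah finds the script under test):
/// <reference path="add.js" />
// Chutzpah picks this file up and shows the test in Test Explorer
test("add returns the sum of two numbers", function () {
    equal(add(1, 2), 3, "1 + 2 should be 3");
});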

Unit testing is a lot easier with this new version of Visual Studio. These are only some of the new features that Visual Studio 2012 brings us. You can try it and discover these great new features yourself.
