
ADO.NET provider with invariant name 'System.Data.SqlClient' could not be loaded

Today's blog post starts with the following error, which appeared when running DB tests on the CI machine:
threw exception: 
System.InvalidOperationException: The Entity Framework provider type 'System.Data.Entity.SqlServer.SqlProviderServices, EntityFramework.SqlServer' registered in the application config file for the ADO.NET provider with invariant name 'System.Data.SqlClient' could not be loaded. Make sure that the assembly-qualified name is used and that the assembly is available to the running application. See http://go.microsoft.com/fwlink/?LinkId=260882 for more information.
    at System.Data.Entity.Infrastructure.DependencyResolution.ProviderServicesFactory.GetInstance(String providerTypeName, String providerInvariantName)

This error happened only on the Continuous Integration machine. On the developers' machines everything was fine. The classic problem: it works on my machine. The CI has the following configuration:

  • TeamCity
  • .NET 4.5.1
  • EF 6.0.2
  • VS2013

It seems that an assembly reference is not copied to the output. If we look closer at the error message, we notice that 'EntityFramework.SqlServer' is the assembly that is missing.
First step: Check the Unit Test project and the rest of the projects to see if that assembly is referenced – YES
Second step: Check that the 'Copy Local' attribute of the reference is set to true – YES
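For reference, 'Copy Local' corresponds to the <Private> element of the reference entry in the test project's .csproj file. A sketch of what the entry should look like (the version and hint path below are illustrative, not taken from the project):
<Reference Include="EntityFramework.SqlServer, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, processorArchitecture=MSIL">
  <HintPath>..\packages\EntityFramework.6.0.2\lib\net45\EntityFramework.SqlServer.dll</HintPath>
  <Private>True</Private>
</Reference>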
Third step: Check that the configuration file of the unit test project specifies the SQL provider – YES
<entityFramework>
  <providers>
    <provider invariantName="System.Data.SqlClient" type="System.Data.Entity.SqlServer.SqlProviderServices, EntityFramework.SqlServer" />
  </providers>
</entityFramework>
Fourth step: Copy the source code directly from the CI machine to the local machine and use it to run the tests – YES (all tests are green)
Fifth step: Force that specific assembly to be loaded from code. We need to use a class from EntityFramework.SqlServer:
Type providerService = typeof(System.Data.Entity.SqlServer.SqlProviderServices);

And YES, the build is green and all tests pass. It seems that the reference was simply not copied to the output folder where the unit tests run.
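For reference, here is a minimal sketch of where such a line can live in an MSTest project so that it runs once, before any test (the class and method names below are hypothetical):

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class EntityFrameworkSqlServerLoader
{
    // Runs once per test assembly, before any test method.
    [AssemblyInitialize]
    public static void ForceEntityFrameworkSqlServerLoad(TestContext context)
    {
        // Referencing a type from EntityFramework.SqlServer makes the compiler
        // keep the assembly reference, so EntityFramework.SqlServer.dll is
        // copied to the test output folder and the EF provider can be resolved.
        Type providerService = typeof(System.Data.Entity.SqlServer.SqlProviderServices);
    }
}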

It seems that this is a known problem with EF 6, without an official solution yet. I hope that this problem will be solved in a future release of EF.

Comments

  1. I was facing the same issue with TeamCity; I had to follow the fifth step.
    Thanks!

  2. In my case, adding System.Data.Entity solved the issue.
    Thanks for the explanation!

  3. Only the fifth step worked for me. Excellent post, man! Thank you so much!

  4. Thank you very very very much!!! Fifth step worked!

  5. Thank you - the fifth step was the key, with a simple modification:
    It worked in Debug mode but not in Release. I suspect compiler optimizations remove the unused variable assignment. To make it work in Release mode as well, I added a function call.
    So my solution was to add the following line anywhere in the test project:

    var bugFix = typeof(System.Data.Entity.SqlServer.SqlProviderServices).ToString();

  6. Solved my issue at step 1! Thanks for the explanation.

  7. https://stackoverflow.com/a/47410082/774494

    I encountered exactly the same problem on my CI build server (running Bamboo), which doesn't have any Visual Studio IDE installed.

    Without making any code changes for the build/test process (which I don't think is a good solution), the best way is to copy EntityFramework.SqlServer.dll to C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE (where your MSTest runs).

    Problem solved!

