SQL Server LocalDB 2014 Connection String

I always face issues with the LocalDB connection string when I download GitHub code developed using SQL Server Express 2012 LocalDB.

I assumed that I could just update my connection string from v11.0 to v12.0, but it seems that Microsoft has changed the naming scheme for this version: the automatic instance is now named MSSQLLocalDB.

So, for SQL Server 2012 LocalDB, I had this connection string:

<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Data Source=(LocalDb)\v11.0;AttachDbFilename=|DataDirectory|\Test.mdf;Initial Catalog=Test;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>

For SQL Server 2014 LocalDB the connection string should be:

<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Data Source=(LocalDb)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\Test.mdf;Initial Catalog=Test;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>

You also need to update the Entity Framework default connection factory setting in the web.config file, changing v11.0 to v12.0 for SQL Server 2014 LocalDB.


<defaultConnectionFactory type="System.Data.Entity.Infrastructure.LocalDbConnectionFactory, EntityFramework">
  <parameters>
    <parameter value="v12.0" />
  </parameters>
</defaultConnectionFactory> 
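
If the application still cannot find the instance, the SqlLocalDB command-line utility that ships with LocalDB is a quick way to check what exists on the machine. A minimal sanity check, assuming LocalDB 2014 is installed:

sqllocaldb info
sqllocaldb info MSSQLLocalDB
sqllocaldb start MSSQLLocalDB

The first command lists all LocalDB instances, the second shows the version and state of the 2014 automatic instance, and the third starts it if it is stopped.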

Hope this will help 🙂

SQL Server Data Tools for Visual Studio 2013 – Database Reverse Engineering

Microsoft created SQL Server Data Tools for Visual Studio 2013 to make database development easier.

  • Single Tool to support developer’s needs
  • Can build, debug, test, maintain, and refactor databases
  • Developers can use familiar Visual Studio tools for database development
    • Code navigation, IntelliSense, C# language, platform-specific validation
    • Debugging and declarative editing in the Transact-SQL editor
  • Works connected or disconnected (tools work on top of a design-time model)
  • Schema model differencing capabilities (compare and update model)
  • Schema and app under TFS control
  • Publish to all supported SQL platforms

Read Full Article – Click

Hope this will help !!!

Jay Ganesh !!!!!!

 

Continuous Integration with Visual Studio 2010

Recently, I was looking for some free tools that could help me with Continuous Integration for my project. I was developing a project in C# using VS2010, with TortoiseSVN and AnkhSVN for version control.

Scenario:

I want a FREE tool that runs on the build server and checks for fresh commits. If a commit breaks the trunk, I want email notifications sent. I also want this tool to run all the MSTest tests periodically and send emails if a test fails.

Solution 1: CruiseControl.NET

CruiseControl.NET is an Automated Continuous Integration server, implemented using the .NET Framework.

Build Server Scenarios

  • Setting up Source Control
  • Build on Check-in
  • Add unit tests
  • Add Coverage
  • Add source code analysis
  • Add packaging
  • Deploy Package

Check this for more details: http://www.cruisecontrolnet.org
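
To give a feel for the configuration, here is a minimal ccnet.config sketch for the scenario above: poll SVN, build with MSBuild, run the MSTest container, and e-mail the team on failure. The repository URL, paths and addresses are placeholders, and element names can differ slightly between CruiseControl.NET versions, so treat it as a starting point rather than a drop-in file.

<cruisecontrol>
  <project name="MySoftware">
    <!-- check the repository for new commits every 60 seconds -->
    <triggers>
      <intervalTrigger seconds="60" />
    </triggers>
    <sourcecontrol type="svn">
      <trunkUrl>http://svn.example.com/MySoftware/trunk</trunkUrl>
      <workingDirectory>D:\Builds\MySoftware</workingDirectory>
    </sourcecontrol>
    <tasks>
      <!-- build the solution -->
      <msbuild>
        <executable>C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe</executable>
        <projectFile>MySoftware.sln</projectFile>
        <buildArgs>/p:Configuration=Release</buildArgs>
      </msbuild>
      <!-- run the MSTest test container -->
      <exec>
        <executable>C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe</executable>
        <buildArgs>/testcontainer:MySoftware.Test.dll</buildArgs>
      </exec>
    </tasks>
    <publishers>
      <xmllogger />
      <!-- mail the developers when the build or tests fail -->
      <email from="build@example.com" mailhost="smtp.example.com" includeDetails="true">
        <users>
          <user name="devteam" group="developers" address="devteam@example.com" />
        </users>
        <groups>
          <group name="developers">
            <notifications>
              <notificationType>Failed</notificationType>
            </notifications>
          </group>
        </groups>
      </email>
    </publishers>
  </project>
</cruisecontrol>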

Solution 2: TeamCity

TeamCity is free for up to 20 build configurations and has an easy-to-use web interface.

It provides:

  • building Visual Studio solutions; native support for MSBuild, PowerShell or NAnt
  • code analysis for C#, VB.NET, XAML, and many other languages, powered by ReSharper
  • testing with .NET testing frameworks, including NUnit, MSTest, MSpec, xUnit and all Gallio-based frameworks
  • code coverage with dotCover, NCover or PartCover
  • best-in-class NuGet support

Check this for more details: https://www.jetbrains.com/teamcity/features/

Solution 3: Jenkins/Hudson

Jenkins is an award-winning application that monitors executions of repeated jobs, such as building a software project or jobs run by cron. Among those things, current Jenkins focuses on the following two jobs:

  • Building/testing software projects continuously, just like CruiseControl or DamageControl. In a nutshell, Jenkins provides an easy-to-use so-called continuous integration system, making it easier for developers to integrate changes to the project, and making it easier for users to obtain a fresh build. The automated, continuous build increases the productivity.
  • Monitoring executions of externally-run jobs, such as cron jobs and procmail jobs, even those that are run on a remote machine. For example, with cron, all you receive is regular e-mails that capture the output, and it is up to you to look at them diligently and notice when it broke. Jenkins keeps those outputs and makes it easy for you to notice when something is wrong.

Check this for more details: https://wiki.jenkins-ci.org/display/JENKINS/Meet+Jenkins

Check this for Jenkins.NET: http://justinramel.com/2012/09/17/jenkins-dot-net/

Summary

All these tools are free and you can choose any of them; they all serve the same purpose: continuous integration (CI).

Hope this will help !!!

Jay Ganesh

Open Source Code Coverage Tool for .NET

In computer science, code coverage is a measure used to describe the degree to which the source code of a program is tested by a particular test suite. A program with high code coverage has been more thoroughly tested and has a lower chance of containing software bugs than a program with low code coverage.

Read more – http://en.wikipedia.org/wiki/Code_coverage

OpenCover – an open source code coverage tool for .NET

An open source code coverage tool (branch and sequence point) for all .NET Frameworks 2 and above (including Silverlight). It is also capable of handling 32- and 64-bit processes. Use ReportGenerator for the best viewing of results (also available via NuGet).

To install OpenCover, run the following command in the Package Manager Console:

PM> Install-Package OpenCover

OpenCover does not execute your tests directly; instead, it launches another application that runs them. In this case we are using MSTest.

First, let's create a batch file so we can execute our tests from the command line:

OpenCover.Console -register:Administrator -target:"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTEST.exe" -targetdir:"D:\Source\Build\Test" -targetargs:"/testcontainer:MySoftware.Test.dll" -filter:"+[*]*" -nodefaultfilters -mergebyhash -output:results.xml

The options used above are:

-target:<target application>

The path of the test runner executable, e.g. NUnit or MSTest.

-targetdir:

The directory of the application whose code coverage is being measured. It should contain all referenced DLLs along with their PDB files (PDBs are mandatory for code coverage).

-targetargs:

Arguments passed to the target application.

Ex: /testcontainer:<test dll>

-filter:<space separated filters>

A list of filters to apply to selectively include or exclude assemblies and classes from the coverage results. Filters have their own format: ±[module-filter]class-filter. If no filters are supplied, a default include-all filter (+[*]*) is applied. You can use * as a wildcard, and an exclusion filter (-) takes precedence over an inclusion filter (+).
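
For example, to cover your own assemblies but keep the test assembly itself out of the numbers, a filter like the following can be used (the assembly names are just placeholders matching the earlier example):

-filter:"+[MySoftware.*]* -[MySoftware.Test]*"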

-nodefaultfilters

A set of default exclusion filters is usually applied; this option turns them off. The default filters are:
-[mscorlib]*
-[mscorlib.*]*
-[System]*
-[System.*]*
-[Microsoft.VisualBasic]*

-mergebyhash

Under some scenarios, e.g. using MSTest, an assembly may be loaded many times from different locations. This option merges the coverage results for an assembly regardless of where it was loaded, assuming the assembly has the same file hash in each location.

-output:<results file name>

Coverage results are stored in this file, which is written to the OpenCover directory or to the location given in the parameter. When executed, OpenCover produces an XML file (results.xml by default) that contains all the data related to that test run.

ReportGenerator

ReportGenerator converts XML reports generated by PartCover, OpenCover or NCover into readable reports in various formats. The reports not only show the coverage quota, but also include the source code and visualize which lines have been covered.

ReportGenerator supports merging several reports into one. It is also possible to pass one XML file containing several reports to ReportGenerator (e.g. a build log file).

The following output formats are supported by ReportGenerator:

  • HTML, HTMLSummary
  • XML, XMLSummary
  • Latex, LatexSummary
  • TextSummary
  • Custom reports

D:\downloads\ReportGenerator_1.6.1.0\bin>ReportGenerator "D:\Source\Build\Test\results.xml" "D:\downloads\opencover.4.0.804\"

If we open the produced coverage output (coverage\index.htm) we can see the visualization of the coverage of our target code.
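
Putting the two steps together, a small batch file can run the tests under OpenCover and rebuild the HTML report in one go. This is only a sketch using the same example paths as above; adjust them for your machine:

@echo off
rem Run the MSTest test container under OpenCover and write results.xml
OpenCover.Console -register:Administrator -target:"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTEST.exe" -targetdir:"D:\Source\Build\Test" -targetargs:"/testcontainer:MySoftware.Test.dll" -filter:"+[*]*" -nodefaultfilters -mergebyhash -output:"D:\Source\Build\Test\results.xml"

rem Convert the raw XML into an HTML report in the coverage folder
D:\downloads\ReportGenerator_1.6.1.0\bin\ReportGenerator "D:\Source\Build\Test\results.xml" "D:\Source\Build\Test\coverage"

rem Open the generated report in the default browser
start "" "D:\Source\Build\Test\coverage\index.htm"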


Hope this will help !!!

Jay Ganesh

 

Improving ASP.NET Security with Visual Studio 2010 Code Analysis

Anyone doing ASP.NET development probably admits, openly or not, to introducing or stumbling upon a security issue at some point during their career. Developers are often pressured to deliver code as quickly as possible, and the complexity of the platform and vast number of configuration options often leaves the application in a less than desirable security state. In addition, the configuration requirements for debugging and production are different, which can often introduce debugging settings in production, causing a variety of issues.

Over the years, the ASP.NET platform has matured and better documentation has been made available through MSDN and community blogs, but knowing which feature or configuration setting to use is often troublesome. Even with good knowledge of the security functionality, mistakes can happen that could result in security vulnerabilities in your application.

Peer code review is a useful process and a good way to catch issues early. Still, not everyone has the time or budget—or knowledgeable peers at hand—for such review.

Since the introduction of code analysis in Visual Studio 2005, developers have been able to automatically analyze their code to see if it complies with a series of best practices ranging from design, maintainability, performance and security. So far, code analysis has been a great tool, but it hasn’t focused on providing best security practice guidance for ASP.NET—until now.

In this article I’ll introduce you to the new ASP.NET code analysis rules that can be used with Visual Studio code analysis as well as with the standalone FxCop application to improve the security of your ASP.NET applications.

Read more…
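
If you want the analysis to run on every build rather than only on demand, code analysis can be switched on per project. Here is a minimal sketch of the relevant MSBuild properties in the .csproj file (the rule set file name is only an example; pick whichever built-in or custom rule set suits your project):

<PropertyGroup>
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisRuleSet>SecurityRules.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>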

Hope this will help !!!

Jay Ganesh