
Introduction to NDepend: Static Code Analysis Tool

June 16, 2018 .NET, .NET Core, .NET Framework, ASP.NET, Best Practices, C#.NET, Code Analysis, Code Quality, Dynamic Analysis, Emerging Technologies, Help Articles, Microsoft, Static Analysis, Tech-Trends, Tools, Visual Studio 2017, VisualStudio, Windows

As a developer, you always have to take the pain of adapting to the best practices and coding guidelines prescribed by organizational or industry standards. An easy way to ensure your coding style follows a given standard is to analyze your code manually or use a static code analyzer like FxCop or StyleCop. In earlier days I was a fan of FxCop, as it was free and provided me all the general guidelines I needed to improve my solution.

In the modern world of programming everything needs to be automated: automating repetitive tasks saves time and money and improves efficiency. This is where static code analyzers become effective.

What is Static Code Analysis?

Static program analysis is the analysis of computer software that is performed without actually executing programs. In most cases the analysis is performed on some version of the program source code; in other cases it is performed on some form of the object code or intermediate compiled code.

The sophistication of a static analyzer varies with how deep it analyzes, from the behavior of individual statements and declarations up to the entire source code.

PS: Analysis performed on executing programs is known as dynamic analysis.

In this article I will give you an overview of one such premier static code analysis tool, which you can use in your daily development routine as well as integrate into CI for DevOps efficiency.

NDepend:

NDepend is a static analysis tool for .NET, specifically for managed code. NDepend supports a large number of code metrics and lets you visualize dependencies using directed graphs and a dependency matrix. It also performs code base snapshot comparisons and validates architectural and quality rules.

The important capabilities of NDepend are:

  • Dependency Visualization through dependency matrix and graphs.
  • Analyze and generate software quality metrics – per the documentation it supports 82 quality metrics.
  • Declarative rule support through LINQ queries, called CQLinq, which comes with a large number of predefined rules (see the sketch after this list).
  • Integration support for CruiseControl.NET, SonarQube, and TeamCity. Code rules can be configured to be checked automatically in Visual Studio or during continuous integration (CI).
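
To give a flavor of CQLinq, below is a minimal sketch of a rule in the style of NDepend's predefined "methods too big" rule; the rule name and the 30-line threshold are illustrative, not NDepend's exact defaults:

// <Name>Avoid methods too big</Name>
warnif count > 0
from m in Application.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }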

License: NDepend is a commercial tool with licensing options as below:

  1. Developer seat – approx. $477 per seat.
  2. Build Machine seat – approx. $955 per seat.

** Volume discounts are available if you procure your licenses in bulk.

Installation: 

Once you obtain a license you will be able to download NDepend_2018.1.1.9041.zip, the latest version available as I write this article. Extract the zip file into a local folder and you will see the different packages/executables within the package.


1.) NDepend.Console – Command-line program to execute NDepend analysis. You would mostly use this component on a CI build server.
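
For example, a CI build step would typically invoke the console analyzer against an NDepend project file; a minimal sketch (the project path below is a placeholder):

NDepend.Console.exe "C:\Projects\MyApp\MyApp.ndproj"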

2.) NDepend.PowerTools – Helps you write your own static analyzer based on NDepend.API, or tweak the existing open-source Power Tools.


3.) NDepend.VisualStudioExtension.Installer – Installs the NDepend extension into Visual Studio.


4.) VisualNDepend – Independent visual environment for managing your NDepend tasks.


The visual tool gives you different options to choose from:

  • Analyze a Visual Studio solution or project.
  • Analyze .NET assemblies in a folder.


For demo purposes, I chose NDepend's own NDepend.PowerTools project.

Demo walkthrough: after the analysis completes, NDepend presents its results in a series of views (screenshots omitted): Summary Report, Application Metrics, Dependency Dashboard, Interactive Graph, Code Matrix View, Quality Gates Summary, and Rules Summary.

Conclusion:

NDepend is one of the best enterprise-grade commercial static analyzers I have seen so far. Visual Studio Code Analysis, FxCop, and StyleCop analyzer tools are available, but they do not provide the extensive analysis reports NDepend provides. Being a commercial tool, it gives customers value for money for what they need. In a day-to-day developer or DevOps lifecycle, you can integrate NDepend into your build process, which can be as simple as executing NDepend.Console and reviewing the output. With NDepend's API it is easy to develop your own custom analysis tools based on CQLinq and NDepend.PowerTools (which is open source). You can find detailed help in the NDepend documentation.


CosmosDb – Connection Policy – Setting Connection Mode and Connection Protocol

May 13, 2018 .NET, Azure, CosmosDB, Microsoft, PaaS, VisualStudio, Windows, Windows Azure Development

Recently I have been trying multiple ways to optimize Cosmos DB SQL .NET SDK calls from my web application, which sits within a VNET.

After carefully analyzing the different options available within the Cosmos DB SQL API, I realized there are different aspects we can optimize to achieve minimal turnaround time. In this article I discuss one such useful find: using the Cosmos DB SQL SDK connection policy to choose different networking options and improve the latency between the web application and Cosmos DB API calls.

Connection Policy:

How the SQL .NET SDK connects to Azure Cosmos DB has important performance implications for a client application, because of the client-side latency introduced by networking conditions. There are two key settings available for configuring the client connection policy: the connection mode and the connection protocol.

There are two connection mode options provided by the Cosmos DB SQL .NET SDK:

  • Gateway Mode (the default): This mode works with all Cosmos DB SDK versions. Since it is only accessible over HTTPS/TCP, it is more secure and the best choice for applications that run on a constrained, secure corporate network. If you are using the .NET Framework version of the Cosmos DB SQL .NET SDK, this is probably the only connection mode that will work for you.
    • Connection Protocol – TCP: 443 is the Cosmos DB port; 10255 is the MongoDB API port.
    • Connection Protocol – HTTPS: Default 443.

  • Direct Mode: This is a newer mode that works only on .NET Standard 2.0 onwards. It gives you the ability to choose between TCP and HTTPS more efficiently. The only caveat is that your client application must target .NET Standard 2.0.
    • Connection Protocol – TCP: TCP is faster when the client and the database are in the same VNET, and you may be amazed by the latency improvements; your application will respond to Cosmos DB requests noticeably faster. NB: In TCP mode, apart from ports 443 and 10255 mentioned under Gateway mode, you also need to ensure the port range 10000–20000 is open in your firewall configuration, because Azure Cosmos DB uses dynamic TCP ports.
    • Connection Protocol – HTTPS: When the client application and Cosmos DB are within the same network limits, HTTPS is also a reliable, secure, and fast access channel, though not as performant as TCP.

(A simplified diagram comparing the two connection modes appeared here; image omitted.)

Sample Code:

// Configure the DocumentClient for Direct mode over TCP
Uri cosmosDbEndpoint = new Uri("https://mycosmosDbinstance.documents.net");
string authKey = "cosmosDb-apiKey";
DocumentClient client = new DocumentClient(cosmosDbEndpoint, authKey,
    new ConnectionPolicy
    {
        ConnectionMode = ConnectionMode.Direct,
        ConnectionProtocol = Protocol.Tcp
    });

Refer more:

You can find the completed sample here: AzureContrib/CosmosDB-DotNet-Quickstart-With-ConnectionPolicy

Blazor – The new experimental web framework from Microsoft

May 2, 2018 .NET, .NET Core, .NET Core 2.0, C#.NET, Emerging Technologies, Microsoft, Razor

In this world of multiple web frameworks, Microsoft does not want to stop experimenting with new frameworks for web development. Innovation is key to Microsoft; it doesn't matter that they started later than React (Facebook) and Angular (Google), as Microsoft has proven most of the time that they are good at developing cutting-edge frameworks. That's how Blazor was born.

Blazor = Browser + Razor

As an ASP.NET MVC developer, I have always loved the Razor syntax that shipped with ASP.NET MVC 3.0. Since then Microsoft has improved Razor with async/await patterns, fluent syntax, and more.

The concept is simple: use .NET for building browser-based apps. Your familiar C# and Razor syntax can add a lot of improvements to the way you build browser apps as a modern-day web developer.

Why use .NET?

To answer this question, here is an excerpt from the Microsoft ASP.NET team blog: "Web development has improved in many ways over the years but building modern web applications still poses challenges. Using .NET in the browser offers many advantages that can help make web development easier and more productive:

• Stable and consistent: .NET offers standard APIs, tools, and build infrastructure across all .NET platforms that are stable, feature rich, and easy to use.
• Modern innovative languages: .NET languages like C# and F# make programming a joy and keep getting better with innovative new language features.
• Industry leading tools: The Visual Studio product family provides a great .NET development experience on Windows, Linux, and macOS.
• Fast and scalable: .NET has a long history of performance, reliability, and security for web development on the server. Using .NET as a full-stack solution makes it easier to build fast, reliable and secure applications."

Blazor will have all the features of a modern web framework, including:

• A component model for building composable UI (see the sketch after this list)
• Routing
• Layouts
• Forms and validation
• Dependency injection
• JavaScript interop
• Live reloading in the browser during development
• Server-side rendering
• Full .NET debugging both in browsers and in the IDE
• Rich IntelliSense and tooling
• Ability to run on older (non-WebAssembly) browsers via asm.js
• Publishing and app size trimming
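
To make the component model concrete, here is a minimal counter component sketch, modeled on the one generated by the default Blazor project template (in the 0.x releases components are .cshtml files; the names here are illustrative):

@page "/counter"

<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button onclick="@IncrementCount">Click me</button>

@functions {
    // Component state lives in ordinary C# fields
    int currentCount = 0;

    // Event handlers are plain C# methods wired up via Razor syntax
    void IncrementCount()
    {
        currentCount++;
    }
}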

Now the usual question arises: how is that possible? Running .NET in a browser?

It all started with WebAssembly, a new web standard for a "portable, size- and load-time-efficient format suitable for compilation to the web."

• WebAssembly enables fundamentally new ways to write web apps. Code compiled to WebAssembly can run in any browser at native speeds.
• WebAssembly is the foundational framework needed to build a .NET runtime that can run in the browser.
• No plugins or extensions required.

Getting Started with Blazor:

The latest version of the Blazor framework is 0.3.0, released on 02/05/2018.

Steps to setup Blazor 0.3.0:

1. Install the .NET Core 2.1 SDK (2.1.300-preview2-008533 or later).
2. Install Visual Studio 2017 (15.7 Preview 5 or later) with the ASP.NET and web development workload selected.
3. Install the latest Blazor Language Services extension from the Visual Studio Marketplace.

Install the Blazor templates using the command line:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates
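
Once the templates are installed, you can create and run a new project from the command line; a minimal sketch (the project name below is a placeholder):

dotnet new blazor -o MyFirstBlazorApp
cd MyFirstBlazorApp
dotnet run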
    


Setting up a Local NPM Repository to Speed Up Dev/CI Builds

April 29, 2018 Emerging Technologies, JavaScript, Modern Web Development, TypeScript, Web

As a modern-day JavaScript developer working with Node.js and npm, you have probably had to clean up your local node modules when a local build breaks. It is a tedious task to clean up %appData%\npm-cache and do a fresh install of all the modules again; depending on the number of modules in your project and your Internet bandwidth, you can be stuck for anywhere from a few minutes to hours waiting for the installation to complete.
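
For reference, the manual cleanup usually looks something like this (on npm 5+ the cache clean requires --force; commands are illustrative):

$ npm cache clean --force    # wipe the local npm cache
$ rm -rf node_modules        # delete installed modules (rd /s /q node_modules on Windows)
$ npm install                # re-download everything from the registry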

Another scenario is a build or CI server, where the entire modules folder is cleaned up during each build; 'npm install' then starts fresh and takes that much longer to complete your build.

What if we had a simple way of caching these packages locally, so that we do not have to download them from the Internet every time? Here is a simple solution that, once set up, resolves some of these problems effectively.

Introducing Local-NPM

local-npm is a Node server that acts as a local npm registry. It serves modules, caches them, and updates them whenever they change. Basically it's a local mirror, but without having to replicate the entire npm registry.

This allows your npm install commands to (mostly) work offline. Your npm installs also get faster and faster over time, as commonly installed modules are aggressively cached.

local-npm acts as a proxy between you and the main npm registry. You run npm install commands as normal, but under the hood all requests are sent through the local server.

Getting Started with Local-NPM:

Step 1: Install the 'local-npm' module:

$ npm install -g local-npm

Step 2: Launch local-npm, which starts the local npm server:

$ local-npm

This will start the local npm server at http://127.0.0.1:5080.

PS: Please note that this step can take some time, as the module replicates the metadata of the entire npm repository (the remote skimdb) to the local database instance for efficient caching. It will not eat up your disk space, though: modules are cached based on usage only; the entire npm repository contents are not replicated.

Step 3: Validate the local-npm registry

There is a basic npmjs-like UI for browsing local packages, which can be accessed at:

http://localhost:5080/_browse

Step 4: Point npm at the local server:

$ npm set registry http://127.0.0.1:5080

Step 5: Run "npm install" for your modules, and you will see that local-npm caches the modules you regularly use.

In case you need to switch back to the default npmjs registry, you can run:

$ npm set registry https://registry.npmjs.org
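
If you prefer not to change the global registry setting, npm also honors a per-project .npmrc file; a minimal sketch placed in the project root:

# .npmrc – route this project's installs through the local-npm server
registry=http://127.0.0.1:5080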

How it works?

npm is built on top of Apache CouchDB (a NoSQL database), so local-npm works by replicating the full "skimdb" database to a local PouchDB server.

You can inspect the running database at http://127.0.0.1:16984/_utils.

References

To learn more about local-npm, visit the module repository and documentation on GitHub: https://github.com/local-npm/local-npm

Introduction to Kubernetes

April 22, 2018 Cloud Computing, Cloud Native Computing Foundation, Computing, Emerging Technologies, Google Cloud, IaaS, OpenSource, PaaS, Platforms

What is Kubernetes?

Kubernetes (a.k.a. K8s) is an open-source system for automating the deployment, scaling, and management of containerized applications. It was originally designed by Google and is now maintained by the Cloud Native Computing Foundation.

What can Kubernetes do?

Kubernetes plays a number of roles in the cloud computing world; it can be thought of as:

• A container platform
• A microservices platform
• A portable cloud platform, and a lot more

Kubernetes defines a set of building blocks ("primitives") which collectively provide mechanisms for deploying, maintaining, and scaling applications. The components that make up Kubernetes are designed to be loosely coupled and extensible so that they can meet a wide variety of workloads. The extensibility is provided in large part by the Kubernetes API, which is used by internal components as well as by extensions and containers running on Kubernetes; a few illustrative kubectl commands follow below.
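
As a quick illustration of those primitives in action, here is a minimal kubectl sketch for deploying, scaling, and exposing an app (it assumes a running cluster and a configured kubectl; the deployment name "hello" and the image "nginx" are placeholders):

$ kubectl create deployment hello --image=nginx   # create a Deployment managing one pod
$ kubectl scale deployment hello --replicas=3     # scale it out to three replicas
$ kubectl expose deployment hello --port=80       # put a Service in front of the pods
$ kubectl get pods                                # inspect the running pods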

If you are interested in knowing more, the official Kubernetes tutorials and online training courses are a good place to start.

Azure Cosmos DB name changes

April 17, 2018 Azure, CosmosDB, Document DB, Emerging Technologies, Microsoft, Windows Azure Development

An update from Microsoft Azure says that, as part of the transition from Azure DocumentDB to Azure Cosmos DB, the service and resource names are changing from "Azure DocumentDB" to "Azure Cosmos DB" on June 1, 2018.

How does that impact you?

When Microsoft introduced Cosmos DB, they ensured a smooth transition/migration of existing DocumentDB customers and tenants to Cosmos DB. This was achieved without changing the underlying service and resource names from "DocumentDB" to "Cosmos DB".

So, if you were an existing DocumentDB customer, all you noticed was the disappearance of the DocumentDB name, with the old service simply showing as Cosmos DB. You did not feel much difference, apart from some additional configuration options as part of the multi-model data source configuration.

Your ARM deployment templates might need some changes in resource sizing, resource location, and some other configuration aspects.

There is no pricing impact because of this change, but you will have to modify billing parameters that rely on the new names. With this deadline, Microsoft intends to deprecate the old DocumentDB naming and migrate all customers/tenants to the new naming for resource billing/sizing purposes.

To read more about the naming changes: https://azure.microsoft.com/en-us/updates/name-changes-cosmos-db/