.NET

Azure Cosmos DB – Change feed support (PREVIEW) – available

July 26, 2018 .NET, .NET Core, .NET Framework, Azure, Azure Cosmos DB, C#.NET, CosmosDB, Microsoft No comments

Today Microsoft announced the preview of change feed support for Azure Cosmos DB, which allows you to build scalable solutions. The change feed is enabled by default on all accounts.

The change feed provides a sorted list of the documents that have changed, in the order in which they were modified by client operations. These changes are persisted and can be processed asynchronously and incrementally, enabling developers to write additional logic that operates on the changes, such as generating reports or invoking another operation like sending an email or writing audit logs.

ChangeFeedProcessor

Start using:
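
As a quick way to get going, below is a minimal sketch using the change feed processor library. It assumes the Microsoft.Azure.DocumentDB.ChangeFeedProcessor NuGet package (the v2 builder API); the account endpoint, key, database, collection and lease-collection names are placeholders you would replace with your own.

   using System;
   using System.Collections.Generic;
   using System.Threading;
   using System.Threading.Tasks;
   using Microsoft.Azure.Documents;
   using Microsoft.Azure.Documents.ChangeFeedProcessor;
   using Microsoft.Azure.Documents.ChangeFeedProcessor.FeedProcessing;

   // Observer invoked with each batch of changed documents.
   public class SampleObserver : IChangeFeedObserver
   {
      public Task OpenAsync(IChangeFeedObserverContext context) => Task.CompletedTask;

      public Task CloseAsync(IChangeFeedObserverContext context, ChangeFeedObserverCloseReason reason)
         => Task.CompletedTask;

      public Task ProcessChangesAsync(IChangeFeedObserverContext context,
         IReadOnlyList<Document> docs, CancellationToken cancellationToken)
      {
         foreach (Document doc in docs)
         {
            // Alternative logic goes here: generate reports, send an email, write audit logs, etc.
            Console.WriteLine("Changed document: " + doc.Id);
         }
         return Task.CompletedTask;
      }
   }

   public static class ChangeFeedSample
   {
      public static async Task RunAsync()
      {
         // Placeholder account details - replace with your own.
         var feedCollection = new DocumentCollectionInfo
         {
            Uri = new Uri("https://myaccount.documents.azure.com:443/"),
            MasterKey = "<account-key>",
            DatabaseName = "mydb",
            CollectionName = "monitored"
         };
         // The processor checkpoints its progress in a separate "leases" collection.
         var leaseCollection = new DocumentCollectionInfo
         {
            Uri = new Uri("https://myaccount.documents.azure.com:443/"),
            MasterKey = "<account-key>",
            DatabaseName = "mydb",
            CollectionName = "leases"
         };

         IChangeFeedProcessor processor = await new ChangeFeedProcessorBuilder()
            .WithHostName("sample-host")
            .WithFeedCollection(feedCollection)
            .WithLeaseCollection(leaseCollection)
            .WithObserver<SampleObserver>()
            .BuildAsync();

         await processor.StartAsync(); // begins pumping changes into SampleObserver
      }
   }

Calling StopAsync() on the processor shuts the pump down gracefully; in a real host you would keep the processor running for the lifetime of the service.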

Introduction to NDepend : Static Code Analysis Tool

June 16, 2018 .NET, .NET Core, .NET Framework, ASP.NET, Best Practices, C#.NET, Code Analysis, Code Quality, Dynamic Analysis, Emerging Technologies, Help Articles, Microsoft, Static Analysis, Tech-Trends, Tools, Visual Studio 2017, VisualStudio, Windows No comments

As a developer, you always have to take the pain of adapting to the best practices and coding guidelines to be followed per your organizational or industry standards. An easy way to ensure your coding style follows a certain standard is to manually analyze your code or to use a static code analyzer like FxCop, StyleCop etc. In earlier days I was a fan of FxCop, as it was free and it provided all the necessary general guidelines for improving my solution.

In the modern world of programming everything needs to be automated, as automating repetitive tasks saves time and money and improves efficiency. This is where static code analysers become effective.

What is Static Code Analysis?

Static program analysis is the analysis of computer software that is performed without actually executing programs, usually on some version of the program source code and, in other cases, on some form of the object code or intermediate compiled code.

The sophistication of static program analysis varies with how deeply it analyses the code, from the behavior of individual statements and declarations up to the entire source code.

PS: Analysis performed on executing programs is known as dynamic analysis.

In this article I will give you an overview of one such premier static code analysis tool that can be used in your daily development routine and integrated into CI for DevOps efficiency.

NDepend:

NDepend is a static analysis tool for .NET, specifically for managed code. NDepend supports a large number of code metrics, allowing you to visualize dependencies using directed graphs and a dependency matrix. It also performs comparisons of code base snapshots and validation of architectural and quality rules.

The important capabilities of NDepend are:

  • Dependency Visualization through dependency matrix and graphs.
  • Analyse and generate software quality metrics – as per the documentation it supports 82 quality metrics.
  • Declarative rule support through LINQ queries – this is called CQLinq and comes with a large number of predefined CQLinq rules (see the sample rule after this list).
  • Integration support for CruiseControl.NET, SonarQube, TeamCity. Code rules can be configured to be checked automatically in Visual Studio or during continuous integration (CI).
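
To give a flavour of CQLinq, here is a small sample rule of my own (an illustration, not necessarily one of the shipped rules) that warns on application methods longer than 30 lines of code:

   // <Name>Sample rule – avoid methods that are too big</Name>
   warnif count > 0
   from m in Application.Methods
   where m.NbLinesOfCode > 30
   orderby m.NbLinesOfCode descending
   select new { m, m.NbLinesOfCode }

The same query syntax powers both the predefined rules and any custom rules you add, so learning a handful of code-model properties such as NbLinesOfCode goes a long way.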

License: NDepend is a commercial tool with licensing options as below:

  1. Developer seats – approx. $477 per seat.
  2. Build Machine seats – approx. $955 per seat.

** You could get a volume discount if you procure your licenses in bulk.

Installation: 

Once you obtain a license you will be able to download NDepend_2018.1.1.9041.zip, the latest version available as I write this article. Extract the zip file into a local folder and you will see the different packages/executables within it.

1.) NDepend.Console – Command-line program to execute NDepend analysis. You would mostly be using this component on a CI build server.

2.) NDepend.PowerTools – Helps you write your own static analyzer based on NDepend.API, or tweak the existing open-source Power Tools (see the sketch after this list).

3.) NDepend.VisualStudioExtension.Installer – Installs the NDepend extension into Visual Studio.

4.) VisualNDepend – Independent visual environment for managing your NDepend tasks.
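
As a taste of item 2, here is a rough sketch of a console-style power tool built on NDepend.API. Treat it as an approximation: the type names follow the public API documentation (NDependServicesProvider, IProject, RunAnalysis), but check the open-source Power Tools' code for authoritative usage, and the .ndproj path is a placeholder.

   using System;
   using System.Linq;
   using NDepend;
   using NDepend.Analysis;
   using NDepend.CodeModel;
   using NDepend.Project;

   class LongMethodsTool
   {
      static void Main()
      {
         // Load an existing NDepend project file (placeholder path).
         var servicesProvider = new NDependServicesProvider();
         IProjectManager projectManager = servicesProvider.ProjectManager;
         IProject project = projectManager.LoadProject(@"C:\Projects\MyApp\MyApp.ndproj");

         // Run a fresh analysis and obtain the resulting code model.
         IAnalysisResult analysisResult = project.RunAnalysis();
         ICodeBase codeBase = analysisResult.CodeBase;

         // Report application methods longer than 30 lines of code,
         // mirroring the CQLinq sample rule shown earlier.
         foreach (IMethod m in codeBase.Application.Methods
                                       .Where(m => m.NbLinesOfCode > 30)
                                       .OrderByDescending(m => m.NbLinesOfCode))
         {
            Console.WriteLine(m.FullName + " : " + m.NbLinesOfCode + " LOC");
         }
      }
   }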

The visual tool gives you different options to choose from:

  • You can analyse a Visual Studio Solution or project.
  • Analyse .NET assemblies in a folder.

For demo purposes, our analysis target will be one of the starter projects from GitHub – ContosoUniversity by @alimon808.

Demo: [Screenshots – Summary Report, Application Metrics, Dependency Dashboard, Interactive Graph, Code Matrix View, Quality Gates Summary, Rules Summary]

Conclusion:

NDepend is one of the best enterprise-grade commercial static analysers I have seen so far. Visual Studio Code Analysis, FxCop and StyleCop analyzer tools are available, but they do not provide the extensive level of analysis reports that NDepend provides. Being a commercial tool, it gives customers value for money according to what they need. In a day-to-day developer or DevOps lifecycle, you can integrate NDepend into your build process, which could be as simple as executing NDepend.Console and reviewing the output. With NDepend's API it is easy to develop your own custom analysis tools based on CQLinq and NDepend.PowerTools (which is open source). You can find all the detailed help in the NDepend documentation.

Node.js 9.x.x and npm 6.x.x – “npm audit” to identify and fix security vulnerabilities in dependencies

June 3, 2018 JavaScript, Javascript Development, Modern Web Development, Node.js, NPM, OpenSource, Package Manager, Tech Newz, TypeScript, Web No comments

It has been a while since I started reading about the major changes introduced in Node.js 9.x.x / npm 6.x.x, and I faced my own Node.js application going for a toss after I upgraded to Node.js 9.x.x, as I always keep Node.js up to date in my development environment.

I use NVM (Node Version Manager) to switch between different versions of Node.js, and I love the flexibility NVM provides. So I was able to quickly switch back to the 8.x.x version when I figured out this change.

But the npm package downgrade did not work using "npm install -g npm@5.x.x" due to old traces of 6.x.x; I had to clean up my npm cache and run npm install again.

Introduction – The “npm audit” command:

Recently, with 6.0.0, the npm team has introduced many improvements such as:

a.) Protection against insecure code in your workflow during npm install. When a user downloads code from the npm registry, npm will review the request against the Node Security Platform database and return a warning if the code contains a vulnerability.

b.) Package signing for publishers. The npm-signature field will allow users of npm packages to verify the integrity of a package regardless of the tools they use to retrieve it or the registry from which they download it.

c.) Security auditing capability (which I am covering in this article).

The audit capability provides the ability to perform a security audit of your project and its dependency components. Simply put, it provides a moment-in-time security review of your project's dependency tree.

  • It will scan your project for any vulnerabilities.
  • You can choose the option to automatically install compatible updates to vulnerable dependencies.
  • Audit reports contain information about security vulnerabilities in your dependencies.
  • The report also contains the steps needed to fix these vulnerabilities, for example by running npm install <package>@new-version.
  • It works very well with private/enterprise registries such as Artifactory etc.
  • It allows the developer to recursively analyze trees of dependent code to identify specifically what's insecure.

The audit command submits a description of the dependencies configured in your project to your default registry and asks for a report of known vulnerabilities.

Quick Insight on the new commands:

  • npm audit – scan your project for vulnerabilities and just show the details, without fixing anything.
  • npm audit [--json] – provide the report in JSON format.
  • npm audit fix – scan and fix all vulnerabilities.
  • npm audit fix --only=prod – skip updating devDependencies.
  • npm audit fix --force – install semver-major updates to all top-level dependencies.
  • npm audit fix --dry-run --json – do a dry run of the fixes and provide a report.

NB: npm audit fix runs a full npm install under the hood, so all configs that apply to npm install also apply to npm audit fix.

Azure Cosmos DB – Programmatically Connect to a preferred location using the SQL API

May 29, 2018 .NET, Azure, CosmosDB, Microsoft, VisualStudio, Windows, Windows Azure Development No comments

Cosmos DB is a multi-region scalable, globally distributed database solution that is part of the Microsoft Azure platform. With a button click, Azure Cosmos DB enables you to elastically and independently scale throughput and storage across any number of Azure's geographic regions. It offers throughput, latency, availability, and consistency guarantees with comprehensive service level agreements (SLAs) that no other database service can offer. [REF]

What is multi-region scalability or global distribution?

What it means is that once you select this option, the underlying platform will ensure that your main database is replicated across the other global regions you have defined.

So when a customer/application requests the data from a certain geo location:

  1. Cosmos DB will serve the data from the nearest available regional copy to provide low-latency access to the database. In order to achieve this, it is recommended to deploy both the application and Azure Cosmos DB in corresponding regions.
  2. In case the nearest region is not defined, it will serve from the nearest available or main copy. This could be East US or West US depending on your deployment decisions.
  3. As part of a BCDR (Business Continuity and Disaster Recovery) plan, in case the main copy is not available, it will fail over to serve the requests from a backup region.

Benefits?

  • Ensured AVAILABILITY @ 99.99% – Azure Cosmos DB offers low-latency reads and writes at the 99th percentile worldwide.
  • Faster READS: it ensures that all reads are served from the closest (local) region. To serve a read request, the quorum local to the region in which the read is issued is used.
  • Reliable WRITES: the same applies to writes. A write is acknowledged only after a majority of replicas have durably committed the write locally, without being gated on remote replicas acknowledging the write.

PS: The replication protocol of Azure Cosmos DB operates under the assumption that the read and write quorums are always local to the region where the request has been issued.

How to turn on multi-region replication in Cosmos DB?

In the Cosmos DB instance settings, select the Replicate data globally page, then add or remove regions by clicking them on the map.

Azure Cosmos DB enables you to configure the regions (associated with the database) as "read", "write" or "read/write" regions.

Then configure the Manual/Automatic failover options as well; I will cover this in later articles.

All that said, you are in good hands with the Azure platform as a Cosmos DB customer or user.

NB: For the purpose of this article, I have configured my instance to run in multiple regions, with East US as the write region and West Europe, North Europe and West US as read regions.

Programmatically Connect to a preferred location using the SQL API:

Now coming to the context of this blog: as an application developer, sometimes you would like to programmatically control access to these regions while using the Cosmos DB .NET SQL API.

In Cosmos DB .NET SDK version 1.8 and later, the ConnectionPolicy parameter of the DocumentClient constructor has a property called Microsoft.Azure.Documents.ConnectionPolicy.PreferredLocations.

  • All reads will be sent to the first available region in the PreferredLocations list. If the request fails, the client will fail over to the next region in the list, and so on.
  • The SDK will automatically send all writes to the current write region.
  • The SDK will only attempt to read from the regions specified in PreferredLocations.
  • For example: if you have 4 read regions defined in your Cosmos DB instance and only 2 regions defined in PreferredLocations in the ConnectionPolicy, the SDK will never serve requests from the other two regions.

NB: The client application can verify the current write endpoint and read endpoint chosen by the SDK by checking two properties, WriteEndpoint and ReadEndpoint. **SDK version 1.8+.
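
For example, assuming an already-constructed DocumentClient named client, a minimal sketch to log which endpoints the SDK selected:

   // Inspect the endpoints the SDK actually chose (properties available in SDK 1.8+).
   Console.WriteLine("Current write endpoint: " + client.WriteEndpoint);
   Console.WriteLine("Current read endpoint:  " + client.ReadEndpoint);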

The following code snippet makes it easier to implement:

 
   // Set the read-region selection preference on the policy.
   ConnectionPolicy connectionPolicy = new ConnectionPolicy();
   connectionPolicy.PreferredLocations.Add(LocationNames.EastUS);     // application's first preference
   connectionPolicy.PreferredLocations.Add(LocationNames.WestEurope); // application's second preference
   // endpointUri and authKey are your Cosmos DB account endpoint and key.
   DocumentClient client = new DocumentClient(endpointUri, authKey, connectionPolicy);

Full Source Code: https://github.com/AzureContrib/CosmosDB-DotNet-Quickstart-Preferred-Location 

Azure Cosmos DB – Connection Policy – Setting Connection Mode and Connection Protocol

May 13, 2018 .NET, Azure, CosmosDB, Microsoft, PaaS, VisualStudio, Windows, Windows Azure Development No comments

Recently I have been trying multiple ways to optimize Cosmos DB SQL .NET SDK calls from my web application, which sits within a VNET.

After carefully analyzing the different options available within the Cosmos DB SQL APIs, I have realized there are different aspects we can optimize to achieve minimal turnaround time. In this article I am going to discuss one such useful find: using the Cosmos DB SQL SDK connection policy with different networking options to improve the latency between web application and Cosmos DB API calls.

Connection Policy:

The performance of a client application depends heavily on how the SQL .NET SDK connects to Azure Cosmos DB, because of the client-side latency imposed by networking conditions. There are two key configuration settings available for configuring the client connection policy – the connection mode and the connection protocol.

There are two connection mode options provided by the Cosmos DB SQL .NET SDK:

  • Gateway Mode (which is the default): This mode works with all Cosmos DB SDK versions. Since it connects only over HTTPS on port 443, it is more secure and the best choice for applications that run on a constrained, secure corporate network. If you are using the .NET Framework version of the Cosmos DB SQL .NET SDK, this is probably the only connection mode that will work for you.

    • Ports: 443 is the Cosmos DB (HTTPS) port; 10255 is the MongoDB API port.
  • Direct Mode: This is a newer mode which works only on .NET Standard 2.0 onwards. It gives you the ability to choose between TCP and HTTPS more efficiently. The only caveat is that you need .NET Standard 2.0 as the target framework for your client application.
    • Connection Protocol – TCP: TCP is faster when the client and database are in the same VNET. Since TCP within the same network is faster, you will be amazed by the latency improvements; your client application will respond noticeably faster to Cosmos DB requests. NB: in TCP mode, apart from ports 443 and 10255 mentioned under Gateway mode, you also need to ensure the port range 10000–20000 is open in your firewall configuration, because Azure Cosmos DB uses dynamic TCP ports.
    • Connection Protocol – HTTPS: when the client application and Cosmos DB are within the same network limits, HTTPS is also a reliable, secure and fast access channel, though not as performant as TCP.

[Image: a simplified diagram of the two connection modes]

Sample Code:

   // Placeholder endpoint and key - replace with your own account values.
   Uri cosmosDbEndpoint = new Uri("https://mycosmosdbinstance.documents.azure.com");
   string authKey = "cosmosDb-apiKey";
   // Direct mode over TCP for the lowest-latency path (requires .NET Standard 2.0+).
   DocumentClient client = new DocumentClient(cosmosDbEndpoint, authKey,
      new ConnectionPolicy
      {
         ConnectionMode = ConnectionMode.Direct,
         ConnectionProtocol = Protocol.Tcp
      });
     

You can find the completed sample here: AzureContrib/CosmosDB-DotNet-Quickstart-With-ConnectionPolicy

[NPM Tip] Error: self signed certificate in certificate chain

May 10, 2018 JavaScript, Javascript Development, OpenSource, TypeScript, Web, Web Development No comments

As a developer, if you are behind a corporate proxy that assigns an intermediary self-signed SSL certificate to every request to provide secure content filtering as part of cybersecurity measures, I am sure you have gone through the pain of getting things working with Node.js.

If you have admin access to your Windows machine, you could simply try the following fix:

  • Simply add an environment variable –
    Name: NODE_TLS_REJECT_UNAUTHORIZED, Value: 0

Hope that solves your problem.