
Azure Cosmos DB – TTL (Time to Live) – Reference Use Case

October 9, 2018 .NET, .NET Core, .NET Framework, Analytics, Architecture, Azure, Azure Cosmos DB, Azure Functions, Azure IoT Suite, Cloud Computing, Cold Path Analytics, CosmosDB, Emerging Technologies, Hot Path Analytics, Intelligent Cloud, Intelligent Edge, IoT Edge, IoT Hub, Microsoft, Realtime Analytics, Visual Studio 2017, VisualStudio, VS2017, Windows

The TTL capability within Azure Cosmos DB is a lifesaver: it takes the necessary steps to purge redundant data based on the configuration you set.

Let us think in terms of an Industrial IoT scenario: devices produce vast amounts of telemetry, logs, and user session information that is only useful while we operate and act on it – to be specific, for a finite period of time. Once that data becomes surplus, we need application logic that purges these old records.

With “Time to Live”, or TTL, Azure Cosmos DB provides the ability to have your documents automatically purged from database storage after a certain period of time (which you configure).

  • TTL is set by default at the document collection level, and can later be overridden on a per-document basis.
  • Once TTL is set, the Cosmos DB service automatically removes documents when their lifetime is over.
  • To track TTL, Cosmos DB uses an offset field that records when the document was last modified. This field, “_ts”, exists in every document you create; it is a UNIX epoch timestamp, and it is updated every time the document is modified. [Ref: Picture1]

[Picture1]

Enabling TTL on Cosmos DB Collection:

You can enable TTL on a Cosmos DB collection simply through the Azure Portal –> Cosmos DB collection settings, either for an existing collection or during creation of a new one.

The TTL value needs to be set in seconds – if you need 90 days: 60 sec * 60 min * 24 hours * 90 days = 7,776,000 seconds.

[Picture2]
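The same configuration can also be applied programmatically. Below is a minimal sketch using the .NET DocumentDB SDK; the database/collection names and the TelemetryEvent type are illustrative, not part of the original walkthrough:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Newtonsoft.Json;

// Illustrative document type; the "ttl" property (in seconds) lets a single
// document override the collection-level default. Null means "inherit".
public class TelemetryEvent
{
    [JsonProperty(PropertyName = "id")]
    public string Id { get; set; }

    [JsonProperty(PropertyName = "ttl", NullValueHandling = NullValueHandling.Ignore)]
    public int? TimeToLive { get; set; }
}

public static class TtlSample
{
    public static async Task ConfigureTtlAsync(DocumentClient client)
    {
        // Collection-level default: purge documents 90 days after their last update.
        var collection = new DocumentCollection
        {
            Id = "telemetry",
            DefaultTimeToLive = 7776000 // 90 days, in seconds
        };
        await client.CreateDocumentCollectionIfNotExistsAsync(
            UriFactory.CreateDatabaseUri("iotdb"), collection);

        // Per-document override: this document expires after only one hour.
        var evt = new TelemetryEvent { Id = Guid.NewGuid().ToString(), TimeToLive = 3600 };
        await client.CreateDocumentAsync(
            UriFactory.CreateDocumentCollectionUri("iotdb", "telemetry"), evt);
    }
}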

Below is one of the reference architectures in which Cosmos DB TTL is essentially useful, and it is viable for almost any IoT business case:

[Picture3]

Hope that was helpful in getting a basic understanding. For more references, visit: Cosmos DB Documentation

Azure Cosmos DB–Multi Master

October 8, 2018 .NET, .NET Core, .NET Framework, ASP.NET, Azure, Azure CLI, Azure Cosmos DB, CosmosDB, Data Consistency, Data Integrity, Microsoft, Multi-master, Performance, Reliability, Resiliency, Scalability, Scale Up

At Ignite 2018, Microsoft announced the general availability of the multi-master feature in Azure Cosmos DB, which gives you more control over data redundancy and elastic scalability for your data across regions, with multiple write and read instances.

What is Multi-Master, essentially?

Multi-master is a capability provided as part of Cosmos DB that gives you multiple write regions, with the option to handle conflict resolution automatically through the different mechanisms the platform provides. In most of the major conflict scenarios you would encounter, conflicts can be resolved with these simple configurations.
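To give a taste of those options (conflict resolution is covered in detail in a later post), here is a minimal sketch of opting a collection into automatic last-writer-wins conflict resolution via the .NET SDK; the database and collection names are illustrative:

using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class ConflictPolicySample
{
    // Creates a collection whose cross-region write conflicts are resolved
    // automatically: the write with the highest _ts (last-modified) value wins.
    public static Task CreateLwwCollectionAsync(DocumentClient client) =>
        client.CreateDocumentCollectionIfNotExistsAsync(
            UriFactory.CreateDatabaseUri("appdb"),   // illustrative database
            new DocumentCollection
            {
                Id = "orders",                       // illustrative collection
                ConflictResolutionPolicy = new ConflictResolutionPolicy
                {
                    Mode = ConflictResolutionMode.LastWriterWins,
                    ConflictResolutionPath = "/_ts"
                }
            });
}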

A sample diagram depicting a use case of a load-balanced web app writing to its respective regional master:


With multi-master, Azure Cosmos DB delivers single-digit millisecond write latency at the 99th percentile anywhere in the world, and now offers 99.999 percent write availability (in addition to 99.999 percent read availability), backed by industry-leading SLAs.


Wow! That’s amazing performance that Cosmos DB guarantees, so that your mission-critical systems can target zero downtime if they start using Cosmos DB.

 

How to Enable Multi-Master support in your Cosmos DB solutions?

Currently, multi-master can only be enabled for new Cosmos DB instances, using the “Enable Multi-Master” option in the Azure Portal, or through the Azure CLI, PowerShell, ARM templates, or the SDK.

These options are detailed below with necessary examples:

1.) Azure Portal – Enable Multi-region writes and Enable geo-redundancy


2.) Azure CLI 
Set the “enable-multiple-write-locations” parameter to “true”

az cosmosdb create \
   --name "thingx-cosmosdb-dev" \
   --resource-group "consmosify-dev" \
   --default-consistency-level "Session" \
   --enable-automatic-failover "true" \
   --locations "EastUS=0" "WestUS=1" \
   --enable-multiple-write-locations true

3.) AzureRM PowerShell
In the AzureRM PowerShell cmdlet, set the enableMultipleWriteLocations property to "true":

$locations = @(@{"locationName"="East US"; "failoverPriority"=0},
             @{"locationName"="West US"; "failoverPriority"=1})

$iprangefilter = ""

$consistencyPolicy = @{"defaultConsistencyLevel"="Session";
                       "maxIntervalInSeconds"= "10";
                       "maxStalenessPrefix"="200"}

$CosmosDBProperties = @{"databaseAccountOfferType"="Standard";
                        "locations"=$locations;
                        "consistencyPolicy"=$consistencyPolicy;
                        "ipRangeFilter"=$iprangefilter;
                        "enableMultipleWriteLocations"="true"}

New-AzureRmResource -ResourceType "Microsoft.DocumentDb/databaseAccounts" `
  -ApiVersion "2015-04-08" `
  -ResourceGroupName "consmosify-dev" `
  -Location "East US" `
  -Name "thingx-cosmosdb-dev" `
  -Properties $CosmosDBProperties

4.) Through the Cosmos DB SDK
Set the connection policy on the DocumentClient, with UseMultipleWriteLocations set to true:

ConnectionPolicy policy = new ConnectionPolicy
{
   ConnectionMode = ConnectionMode.Direct,
   ConnectionProtocol = Protocol.Tcp,
   UseMultipleWriteLocations = true,
};
policy.PreferredLocations.Add("East US");
policy.PreferredLocations.Add("West US");
policy.PreferredLocations.Add("West Europe");
policy.PreferredLocations.Add("North Europe");
policy.PreferredLocations.Add("Southeast Asia");
policy.PreferredLocations.Add("Japan East");
policy.PreferredLocations.Add("Japan West");

Azure Cosmos DB’s multi-master configuration is the game changer that really makes it a true global-scale database, with automatic conflict resolution capabilities for data synchronization and consistency.

In later posts I will write examples covering how conflict resolution can be configured and used in real-time scenarios.


NDepend–VSTS/Azure DevOps Integration–Part 01

September 30, 2018 .NET, .NET Core, .NET Framework, Azure DevOps, Best Practices, Code Analysis, Code Quality, Dynamic Analysis, Emerging Technologies, Microsoft, Static Analysis, Tools

In my previous article I wrote an introduction to NDepend and how it can help an Agile team ensure code quality.

In that article we saw how to use NDepend on a developer machine. In this article we will familiarize ourselves with using NDepend in your build automation pipeline, on a VSTS/Azure DevOps build agent.

There are two types of integration possible for NDepend:

  1. Directly using NDepend Package Extension from VSTS Marketplace
  2. Manual integration using the NDepend command-line tool. (This provides more control over licensing, by setting up the license on your own on-prem VSTS build agent.)

For the purposes of this article I will cover using the VSTS package extension and the NDepend build task in a VSTS build pipeline.

Installation of NDepend Extension for VSTS/Azure DevOps:

1.) Go to the Azure DevOps Marketplace: https://marketplace.visualstudio.com/items?itemName=ndepend.ndependextension


2.) Click Get to install this extension into your Azure DevOps account and follow the steps. For demo purposes I am starting with the 30-day free trial; otherwise you can go ahead and buy the full license.


3.) Now when you get back to your Azure DevOps project, you can see the NDepend side menu enabled; this is where you will see the report summary for your project.


Integrating NDepend into the Azure DevOps Pipeline:

1.) Select the “NDepend Task” and add it to the pipeline.


Note:

  • You can choose to stop the build when at least one quality gate fails.
  • You also need to specify an NDepend project file customized for your project; otherwise NDepend will use its default project file configuration.  Having your own NDepend project file gives you more control over the policies for the scan.

Queue a new build and wait for it to complete. You can now see that the build artifacts include all the NDepend report files.


Now go back to the NDepend menu in the left-side menu and open the Summary tab. This provides a detailed view of the technical debt in your project.


In the next article I will cover the manual integration steps.

Azure Cosmos DB – Change feed support (Preview) – available

July 26, 2018 .NET, .NET Core, .NET Framework, Azure, Azure Cosmos DB, C#.NET, CosmosDB, Microsoft

Today Microsoft announced the preview of change feed support for Azure Cosmos DB, which allows you to build scalable solutions. Change feed is enabled by default on all accounts.

Change feed provides a sorted list of changed documents, in the order in which they were modified by client operations. These changes are persisted and can be processed asynchronously and incrementally, enabling developers to write logic that operates on each change – generating reports, or invoking another operation such as sending an email or writing an audit log.
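The post’s full sample is not reproduced here, but as a rough sketch, the change feed can be read directly with the DocumentDB .NET SDK as follows; the database and collection names are placeholders:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class ChangeFeedSample
{
    public static async Task ReadChangesAsync(DocumentClient client)
    {
        Uri collectionUri = UriFactory.CreateDocumentCollectionUri("appdb", "items");

        // The change feed is read independently for each partition key range.
        var ranges = await client.ReadPartitionKeyRangeFeedAsync(collectionUri);
        foreach (PartitionKeyRange range in ranges)
        {
            var query = client.CreateDocumentChangeFeedQuery(collectionUri, new ChangeFeedOptions
            {
                PartitionKeyRangeId = range.Id,
                StartFromBeginning = true, // or resume from a saved continuation token
                MaxItemCount = 100
            });

            while (query.HasMoreResults)
            {
                // Documents are returned in the order they were modified.
                foreach (Document changed in await query.ExecuteNextAsync<Document>())
                    Console.WriteLine($"Changed: {changed.Id}"); // e.g. send email, write an audit log
            }
        }
    }
}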

ChangeFeedProcessor: the Change Feed Processor library builds on top of the change feed, handling lease management and scaling the processing out across multiple workers.


Introduction to NDepend: Static Code Analysis Tool

June 16, 2018 .NET, .NET Core, .NET Framework, ASP.NET, Best Practices, C#.NET, Code Analysis, Code Quality, Dynamic Analysis, Emerging Technologies, Help Articles, Microsoft, Static Analysis, Tech-Trends, Tools, Visual Studio 2017, VisualStudio, Windows

As a developer, you always have to take the pain of adapting to the best practices and coding guidelines required by organizational or industry standards.  An easy way to ensure your coding style follows a certain standard is to manually analyze your code, or to use a static code analyzer like FxCop, StyleCop, etc. In earlier days I was a fan of FxCop, as it was free and provided all the necessary general guidelines for improving my solution.

In the modern world of programming everything needs to be automated, as automating repetitive tasks saves time and money and improves efficiency. This is where static code analysers become effective.

What is Static Code Analysis?

Static program analysis is the analysis of computer software performed without actually executing the program – usually on some version of the program’s source code, and in other cases on some form of the object code or intermediate compiled code.

The sophistication of static program analysis varies with how deeply it analyzes: from the behavior of individual statements and declarations up to the entire source code.

PS: Analysis performed on executing programs is known as dynamic analysis.

In this article I will give you an overview of one such premier static code analysis tool, which you can use in your daily development routine as well as in CI integration for DevOps efficiency.

NDepend:

NDepend is a static analysis tool for .NET, specifically for managed code. NDepend supports a large number of code metrics, allowing you to visualize dependencies using directed graphs and a dependency matrix. It also performs comparisons of code base snapshots, and validation of architectural and quality rules.

The important capabilities of NDepend are:

  • Dependency visualization through a dependency matrix and graphs.
  • Analysis and generation of software quality metrics – per the documentation it supports 82 quality metrics.
  • Declarative rule support through LINQ queries, called CQLinq, which comes with a large number of predefined CQLinq rules (see the sample rule after this list).
  • Integration support for CruiseControl.NET, SonarQube, and TeamCity. Code rules can be configured to be checked automatically in Visual Studio or during continuous integration (CI).
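As an illustration of CQLinq, here is a small rule in the style of NDepend’s predefined rules; it flags methods that grow beyond a threshold (the 30-line limit is just an example):

// <Name>Methods too big</Name>
warnif count > 0
from m in JustMyCode.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }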

License: NDepend is a commercial tool with licensing options as below:

  1. Developer seats – $477 approx. / per seat.
  2. Build Machine seats  – $955 approx. / per seat.

** You can get a volume discount if you procure your licenses in bulk.

Installation: 

Once you have obtained a license, you will be able to download NDepend_2018.1.1.9041.zip, the latest version available as I write this article. Extract the zip file into a local folder, and you will see the different packages/executables within the package.


1.) NDepend.Console – Command-line program to execute NDepend analysis.  You would mostly be using this component on a CI build server.

2.) NDepend.PowerTools – Helps you write your own static analyzer based on NDepend.API, or tweak the existing open-source Power Tools.


3.) NDepend.VisualStudioExtension.Installer – Installs the NDepend extension into Visual Studio.


4.) VisualNDepend – Independent visual environment for managing your NDepend tasks.


The visual tool gives you different options to choose from:

  • Analyse a Visual Studio solution or project.
  • Analyse .NET assemblies in a folder.


For demo purposes, our analysis target will be one of the starter projects from GitHub – ContosoUniversity by @alimon808.


Demo: Summary Report


Demo: Application Metrics


Demo: Dependency Dashboard:


Demo: Interactive Graph


Demo: Code Matrix View


Demo: Quality Gates Summary


Demo: Rules Summary


Conclusion:

NDepend is one of the best enterprise-grade commercial static analysers I have seen so far.  Visual Studio Code Analysis, FxCop, and StyleCop analyzer tools are available, but they do not provide the extensive level of analysis and reporting that NDepend does. Being a commercial tool, it gives customers value for money based on what they need.  In a day-to-day developer or DevOps lifecycle, you can integrate NDepend into your build process, which can be as simple as executing NDepend.Console and reviewing the output. With NDepend’s API it is easy to develop your own custom analysis tools based on CQLinq and NDepend.PowerTools (which is open source). You can find detailed help in the NDepend documentation.


Blazor – The new experimental web framework from Microsoft

May 2, 2018 .NET, .NET Core, .NET Core 2.0, C#.NET, Emerging Technologies, Microsoft, Razor

In this world of multiple web frameworks, Microsoft would not want to stop experimenting with new frameworks for web development. Innovation is key to Microsoft; it doesn’t matter that they started later than React (Facebook) and Angular (Google) – Microsoft has proven most of the time that they are good at developing cutting-edge frameworks.  That’s how Blazor was born.

Blazor = Browser + Razor

As an ASP.NET MVC developer I always loved the Razor syntax that shipped with ASP.NET MVC 3.0. Since then Microsoft has improved Razor with async/await patterns, fluent syntax, and more.

The concept is simple: use .NET to build browser-based apps. Your familiar C# and Razor syntax brings lots of improvements to the way you build browser apps as a modern-day web developer.
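To make that concrete, here is the counter component from the default Blazor project template, roughly as it appeared in the early 0.x releases (note that the onclick/@functions syntax changed in later versions):

@page "/counter"

<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button onclick="@IncrementCount">Click me</button>

@functions {
    int currentCount = 0;

    void IncrementCount()
    {
        currentCount++;
    }
}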

Why use .NET?

To answer this question, here is an excerpt from the Microsoft ASP.NET team blog: “Web development has improved in many ways over the years but building modern web applications still poses challenges. Using .NET in the browser offers many advantages that can help make web development easier and more productive:

  • Stable and consistent: .NET offers standard APIs, tools, and build infrastructure across all .NET platforms that are stable, feature rich, and easy to use.
  • Modern innovative languages: .NET languages like C# and F# make programming a joy and keep getting better with innovative new language features.
  • Industry leading tools: The Visual Studio product family provides a great .NET development experience on Windows, Linux, and macOS.
  • Fast and scalable: .NET has a long history of performance, reliability, and security for web development on the server. Using .NET as a full-stack solution makes it easier to build fast, reliable and secure applications.”

Blazor will have all the features of a modern web framework including:

  • A component model for building composable UI
  • Routing
  • Layouts
  • Forms and validation
  • Dependency injection
  • JavaScript interop
  • Live reloading in the browser during development
  • Server-side rendering
  • Full .NET debugging both in browsers and in the IDE
  • Rich IntelliSense and tooling
  • Ability to run on older (non-WebAssembly) browsers via asm.js
  • Publishing and app size trimming

Now the usual question arises: how is that possible? Running .NET in a browser?

It all started with WebAssembly, a new web standard for a “portable, size- and load-time-efficient format suitable for compilation to the web”.

  • WebAssembly enables fundamentally new ways to write web apps. Code compiled to WebAssembly can run in any browser at native speeds.
  • WebAssembly is the foundational framework needed to build a .NET runtime that can run in the browser.
  • No plugins or extensions required.

Getting Started with Blazor:

The latest version of the Blazor framework available is 0.3.0, released on May 2, 2018.

Steps to set up Blazor 0.3.0:

  1. Install the .NET Core 2.1 SDK (2.1.300-preview2-008533 or later).
  2. Install Visual Studio 2017 (15.7 Preview 5 or later) with the ASP.NET and web development workload selected.
  3. Install the latest Blazor Language Services extension from the Visual Studio Marketplace.

Install the Blazor templates using command-line:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates
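With the templates installed, creating and running your first Blazor app should be as simple as the following (the project name here is arbitrary):

dotnet new blazor -o BlazorApp1
cd BlazorApp1
dotnet run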
