Welcome to the World of Tomorrow:

We are in an emerging world of technology and innovation, one where IoT cannot stand alone.

The Internet of Things (a.k.a. IoT) provides us with a wealth of sensor data. However, this sensor information by itself does not add value unless we can turn it into actionable, contextualized information that is useful to people. Big data, data visualization and Machine Learning (ML) techniques allow us to gain new insights through learning, batch processing and information analysis (online or offline). Real-time sensor data analysis and decision-making are often done manually; to make them scalable, they should preferably be automated.

Artificial Intelligence (AI) provides the frameworks and tools to go beyond the typical real-time decision and automation use cases for IoT. Who knows, we may end up in a world where humans have less work to do, as in Sci-Fi Hollywood movies. Whatever happens, we have the responsibility to build a better world for the next generations.

The intent of this blog is to provide you with the latest insights into what is happening in the Cloud, IoT and AI industry. Keep reading, keep contributing and please share as well.

Getting Started with Azure Functions App

In my previous article I gave you an overview of Azure Functions and discussed its benefits. In this session I will walk you through the steps needed to create a basic Function App.

Getting Started:

Log in to the Azure Portal and you will see the Function Apps section in the left menu. This is where all your Function Apps are listed once you log in.

Let us start by creating a new Function App. In the portal search box, type Functions.

Select Function App from the Web + Mobile category and click the Create button.

Fill in the details:

  1. App Name
  2. Subscription
  3. Resource Group (create a new one or select an existing one)
  4. Hosting Plan
  5. Storage
  6. Click Create

You will see a deployment-in-progress message.

Once you select the Function App instance and explore further, you will be able to view its URL, and in the left-side menu you will see the options to configure:

  1. Functions
  2. Proxies (Preview Feature)
  3. Slots (Preview Feature)

Getting Started – create your first Function

Now that we have the new instance ready, let us create our first Function.

We have to choose two things:

1.)  Choose a scenario:

  1. Webhook + API
  2. Timer
  3. Data Processing

2.)  Choose a language:

  1. JavaScript
  2. CSharp
  3. FSharp
  4. For PowerShell, Python and Batch, you create your own custom function

For the demo’s sake, I am creating a Timer scenario and have selected CSharp as the language.

I created a simple trigger function and clicked Save and run.

The job completed within the expected delay we introduced through Thread.Sleep:

Code Sample:

 
using System;
using System.Threading;

// Timer-triggered entry point; the schedule is configured in the function's Integrate settings.
public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executing at: {DateTime.Now}");

    RunTest(log);

    log.Info($"C# Timer trigger function completed at: {DateTime.Now}");
}

// Simulates a long-running job by sleeping for one second in each of 100 iterations.
public static void RunTest(TraceWriter log)
{
    for (int i = 0; i < 100; i++)
    {
        log.Info($"C# Timer trigger function executing iteration: {i}");

        Thread.Sleep(1000);

        log.Info($"C# Timer trigger function completed iteration: {i}");
    }
}
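
As a side note (an addition of mine, not part of the original sample): the TimerInfo parameter also exposes an IsPastDue flag that is true when a scheduled run was missed, for example after an app restart. A minimal sketch:

using System;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // IsPastDue is true when the runtime missed the scheduled invocation time.
    if (myTimer.IsPastDue)
    {
        log.Info("Timer is running late!");
    }

    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}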

Using the Functions –> Integrate section, we can configure input and output bindings and schedule timers, and make the function available as a Web API method. You can then call this functional logic from another application as a web API call, passing the necessary inputs to start another functional process.
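
For reference (the exact schedule used in this demo is not shown, so treat this as an assumption): the timer schedule is a six-field CRON expression configured in the Integrate section; for example, a schedule of 0 */5 * * * * runs the function every five minutes.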

One example of this would be invoking a database record archival call after an order is completed. This applies when we choose the WebHook + API scenario during the creation of the functional logic; a sketch of such a function is shown below.
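
As an illustration only (this code is not from the original walkthrough), below is a minimal HTTP-triggered function in the classic C# script style; the orderId parameter and the archival step are hypothetical placeholders:

using System.Linq;
using System.Net;
using System.Net.Http;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function received an archival request.");

    // Hypothetical input: the completed order's id, passed as a query string parameter.
    string orderId = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "orderId", true) == 0)
        .Value;

    if (string.IsNullOrEmpty(orderId))
    {
        return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass an orderId on the query string");
    }

    // The actual archival of the database record would go here.
    log.Info($"Archiving database record for order {orderId}");

    return req.CreateResponse(HttpStatusCode.OK, $"Archival started for order {orderId}");
}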

That’s all for now on this topic. I will cover more details about WebHook + API in the next part of the series.

Please share your comments and rate this article to help me understand areas of improvement.

Azure Functions App – Run On-Demand Serverless Code – a Pathway to Serverless Computing

Azure Functions is a cloud solution from Azure that lets you execute small pieces of code, or “functions”, in the cloud. This means you do not have to worry about the infrastructure or environment required to run your piece of code to solve a business problem.

Functions can make development even more productive, and you can use your development language of choice.

Benefits:

  • Pay only for the time your code runs and trust Azure to scale as needed.
  • Azure Functions lets you develop serverless applications on Microsoft Azure.
  • Supports a wide variety of development languages, such as C#, F#, Node.js, Python and PHP.
  • Bring your own dependencies – you can bring any of your NuGet/npm dependencies for your functional logic.

What can we do with Azure Functions?

Azure Functions is a very good solution for processing data, integrating systems, working with the Internet of Things (IoT), and building simple APIs and microservices.

Functions provides templates to help you get started with some useful scenarios, including the following (a minimal QueueTrigger sketch follows the list):

  • BlobTrigger – Process Azure Storage blobs when they are added to containers. You might use this function for image resizing.
  • EventHubTrigger – Respond to events delivered to an Azure Event Hub. Particularly useful in application instrumentation, user experience or workflow processing, and Internet of Things (IoT) scenarios.
  • Generic Webhook – Process webhook HTTP requests from any service that supports webhooks.
  • GitHub Webhook – Respond to events that occur in your GitHub repositories.
  • HTTPTrigger – Trigger the execution of your code by using an HTTP request.
  • QueueTrigger – Respond to messages as they arrive in an Azure Storage queue.
  • ServiceBusQueueTrigger – Connect your code to other Azure services or on-premises services by listening to message queues.
  • ServiceBusTopicTrigger – Connect your code to other Azure services or on-premises services by subscribing to topics.
  • TimerTrigger – Execute cleanup or other batch tasks on a predefined schedule.
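
To give a sense of how small these functions can be, here is a minimal QueueTrigger function in C# script style, assuming a queue trigger binding named myQueueItem has been configured:

public static void Run(string myQueueItem, TraceWriter log)
{
    // The queue binding delivers each message as the myQueueItem parameter.
    log.Info($"C# Queue trigger function processed message: {myQueueItem}");
}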

Integration Support with other Azure Services:

The following service integrations are supported by the Azure Functions app:

  • Azure Cosmos DB
  • Azure Event Hubs
  • Azure Mobile Apps (tables)
  • Azure Notification Hubs
  • Azure Service Bus (queues and topics)
  • Azure Storage (blob, queues, and tables)
  • GitHub (webhooks)
  • On-premises (using Service Bus)
  • Twilio (SMS messages)

Costing:

Azure Functions is charged based on one of the two pricing plans below:

  1. App Service Plan – if you already have an Azure App Service running Logic, Web, Mobile or Web Job apps, you can use the same environment to run your Azure Functions without paying for extra resources. You will be charged at the regular App Service rates.
  2. Consumption plan – with this plan you pay only for how many times your functions run, how long they run, and the resources they use during that execution time. Consumption plan pricing includes a monthly free grant of 1 million executions and 400,000 GB-s of resource consumption (a worked example follows this list).
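
As a hypothetical worked example (my own numbers, using only the free-grant figures above): a function allocated 0.5 GB of memory that runs for one second per execution and is invoked 1 million times in a month consumes 0.5 GB × 1 s × 1,000,000 = 500,000 GB-s. The free grant covers the 1 million executions and the first 400,000 GB-s, so only the remaining 100,000 GB-s would be billed at the published per-GB-s rate.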

You can find further pricing-related info here.

Support and SLA:

  • Free billing and subscription management support
  • Flexible support plans starting at $29/month. Find a plan
  • 99.95% guaranteed uptime. Read the SLA

Managed Azure Database for MySQL and PostgreSQL

During the Microsoft Build 2017 conference (May 10th, 2017) in Seattle, Scott Guthrie (EVP of the Cloud and Enterprise Group) announced two new offerings in the Azure Database Services Platform: Azure Database for MySQL and Azure Database for PostgreSQL.

I was happy to see Microsoft filling the gap for fully managed MySQL and PostgreSQL. I recollect that around April I was trying to migrate this WordPress blog from GoDaddy hosting to an Azure App Service, and since WordPress requires MySQL as its database, the only options left for me in Azure were local MySQL (MySQL in App) inside the App Service, which does not scale well, or the ClearDB service (from a Microsoft partner on Azure). Somehow I wasn’t happy with the performance of either local MySQL or ClearDB, given my bulky blog. So I wondered: what if there were a managed MySQL service, just like the managed Azure SQL Database service?

What is Azure Database for MySQL and PostgreSQL?

Azure Database for MySQL and PostgreSQL (currently in PREVIEW) are fully managed Platform as a Service (PaaS) offerings from Microsoft Azure, which free us from worrying about the infrastructure and from managing the server instance. Below is an outline of how these services stack up against the existing SQL Database offerings. As a customer you do not need to worry about compute, storage, networking, or the performance, availability and scalability of these services; they are ensured by the Azure Database Services Platform, with built-in monitoring.

You can easily deploy an Azure Web App with Azure Database for MySQL as the database provider to build complete solutions for common Content Management Systems (CMS) such as WordPress and Drupal. A small connection sketch is shown below.
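
As an illustration only (not from the original post), here is a minimal sketch of connecting to an Azure Database for MySQL server from C# using the MySql.Data NuGet package; the server name, database, user and password are hypothetical placeholders:

using System;
using MySql.Data.MySqlClient;

public class MySqlConnectTest
{
    public static void Main()
    {
        // Hypothetical server, database and credentials; note the user@servername format
        // and SslMode, since Azure Database for MySQL enforces SSL by default.
        var connectionString =
            "Server=mydemoserver.mysql.database.azure.com;Port=3306;Database=wordpressdb;" +
            "Uid=myadmin@mydemoserver;Pwd=<your-password>;SslMode=Required;";

        using (var connection = new MySqlConnection(connectionString))
        {
            connection.Open();

            using (var command = new MySqlCommand("SELECT VERSION();", connection))
            {
                // Print the MySQL server version to confirm connectivity.
                Console.WriteLine($"Connected to MySQL version: {command.ExecuteScalar()}");
            }
        }
    }
}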

I will cover more details in a later series. That’s all for now. Thank you for reading, and please leave your comments.

Big Data & Front End Development tracks in the Microsoft Professional Program

Earlier I introduced you to the Microsoft Professional Program for Data Science. A few days later, Microsoft announced the BETA availability of two more tracks: Big Data and Front End Development.

Big Data Track:

This Microsoft program will help you learn the necessary skills, from cloud storage and databases to Hadoop, Spark, and managed data services in Azure. The curriculum covers how to build big data solutions for batch and real-time stream processing using Azure managed services and open-source systems like Hadoop and Spark.

If you intend to pursue a Data Analytics career, this is the right program for you to gain the necessary insights.

The technologies you will apply to gain these skills are: Azure Data Lake, Hadoop, HDInsight, Spark, Azure Data Factory and Azure Stream Analytics.

Below is the course outline:

  • 10 COURSES | 12-30 HOURS PER COURSE | 8 SKILLS
  • ENROLL NOW here
  • More details here

Front End Development Track:

This track provides you with the necessary skills to get started with advanced front-end development using HTML5, CSS3, JavaScript, AngularJS and Bootstrap. At the end of the curriculum you will have mastered front-end development with all the predominant modern web technologies.

So if you are a front end UI developer, this is something you can try out to enhance your skills.

Below is the course outline:

  • 13 COURSES | 15-30 HOURS PER COURSE | 11 SKILLS
  • ENROLL NOW here
  • More details here

Track details:

Each course runs for three months, starting at the beginning of a quarter: January-March, April-June, July-September, and October-December. The capstone runs for four weeks at the beginning of each quarter: January, April, July and October. For exact dates for the current course run, please refer to the course detail page on edX.org.

[Microsoft]

Microsoft Professional Program for Data Science

Microsoft has come up with a new program to bring more skilled people into the field of Data Science by providing them with the right training on the right set of tools.

Microsoft has put together a curriculum to teach key functional and technical skills, combining highly rated online courses with hands-on labs and concluding in a final capstone project. All of this training is delivered by Microsoft, either online or through recorded sessions.

The program comprises 10 courses, 16-32 hours per course, and 8 skills.

The technology skills you will gain are: T-SQL, Microsoft Excel, Power BI, Python, R, Azure Machine Learning, HDInsight and Spark.

ENROLL NOW: through this link

Course schedule:
For exact dates for the course, please refer to the course detail page on edX.org.

For more details on this program: https://academy.microsoft.com/en-us/professional-program/data-science/ 

** This course would provide the necessary insight to attempt Microsoft’s new certification: Microsoft Certified Solutions Associate (MCSA) – Machine Learning.

Happy Learning!!

Introduction to Data Science

We have all been hearing the terms Data Science and Data Scientist as this occupation becomes more popular. I thought of shedding some light on this specific area of science, which may be interesting for suitably skilled readers of my blog.

Data Science is one of the hottest topics in computing and on the Internet nowadays. People and corporations have been gathering data from applications, systems and devices, and now is the time to analyze it. The worldwide adoption of the Internet of Things has also added more scope for analyzing and operating on the huge volumes of data accumulated from these devices in near real-time.

As the standard Wikipedia definition goes, “Data science, also known as data-driven science, is an interdisciplinary field about scientific methods, processes and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining.”

Data Science requires the following skillset:

  • Hacking Skills
  • Mathematics and Statistical Knowledge
  • Substantive Scientific Expertise

[Image Source: From this article by Berkeley Science Review.]

Data Science Process:

The Data Science process involves collecting raw data, processing it, cleaning it, analyzing it using models and algorithms, and visualizing the results for presentation. This process is explained through a visual diagram from Wikipedia.

[Data science process flowchart, source wikipedia]

Who are Data Scientists?

Data scientists use their data and analytical ability to find and interpret rich data sources; manage large amounts of data despite hardware, software, and bandwidth constraints; merge data sources; ensure consistency of datasets; create visualizations to aid in understanding data; build mathematical models using the data; and present and communicate the data insights/findings.

They are often expected to produce answers in days rather than months, to work by exploratory analysis and rapid iteration, and to present results with dashboards (displays of current values) rather than the papers and reports statisticians normally produce.

Importance of Data Science and Data Scientist:

“This hot new field promises to revolutionize industries from business to government, health care to academia.”

The New York Times

Data Scientist is the sexiest job of the 21st century, according to the Harvard Business Review.

McKinsey & Company projects a global excess demand of 1.5 million new data scientists.

What are the skills required of a Data Scientist? Let me share a visualization in the form of a brain dump.

I thought of sharing an image to take you through the essential skill requirements for a Modern Data Scientist.

So what are you waiting for? If you are suitably skilled, get yourself enrolled in a Data Science course.
