Cloud and Datacenter Management Blog

A Microsoft Hybrid Cloud blog about management



#Microsoft #AzureDevOps – Azure Pipelines, #Azure Boards + GitHub with @AbelSquidHead #LoECDA

Azure DevOps for CI/CD

Azure DevOps Services is a cloud service for collaborating on code development. It provides an integrated set of features that you access through your web browser or IDE client, including:

  • Git repositories for source control of your code
  • Build and release services to support continuous integration and delivery of your apps
  • Agile tools to support planning and tracking your work, code defects, and issues using Kanban and Scrum methods
  • Many tools to test your apps, including manual/exploratory testing, load testing, and continuous testing
  • Highly customizable dashboards for sharing progress and trends
  • Built-in wiki for sharing information with your team

The Azure DevOps ecosystem also supports adding extensions, integrating with other popular services (such as Campfire, Slack, Trello, and UserVoice), and developing your own custom extensions.
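If you prefer working from the command line, the Azure CLI has an Azure DevOps extension that can wire up a pipeline for you. Here is a minimal sketch, assuming the Azure CLI is installed and you are signed in; the organization, project, and repository names are placeholders:

    # Install the Azure DevOps extension for the Azure CLI
    az extension add --name azure-devops

    # Set a default organization and project so you don't repeat them
    az devops configure --defaults \
      organization=https://dev.azure.com/MyOrg project=MyProject

    # Create a YAML-based build pipeline from a GitHub repository
    az pipelines create --name "MyApp-CI" \
      --repository https://github.com/MyOrg/MyApp \
      --branch master \
      --yml-path azure-pipelines.yml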

Start your CI/CD Pipelines Today with Azure DevOps

More information about Microsoft Azure DevOps:

Microsoft Azure DevOps Docs

Azure DevOps Community Group on LinkedIn

Azure DevOps PODCAST

and stay up to date on Azure DevOps via Twitter:

The #LoECDA Team

@AzureDevOps

@DonovanBrown

@AbelSquidHead

@jldeen

@damovisa

@StevenMurawski



View Container Live logs with #Azure Monitoring #AKS #Kubernetes #Containers #AzureDevOps

Monitoring an Azure Kubernetes Cluster

Azure Monitor for containers is a feature designed to monitor the performance of container workloads deployed to either Azure Container Instances or managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). Monitoring your containers is critical, especially when you’re running a production cluster, at scale, with multiple applications.
Azure Monitor for containers gives you performance visibility by collecting memory and processor metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux and stored in your Log Analytics workspace.
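If monitoring is not yet enabled on an existing AKS cluster, you can switch it on from the command line. A minimal sketch, assuming the Azure CLI is installed; the cluster name, resource group, and workspace resource ID below are placeholders:

    # Enable the Azure Monitor for containers add-on on an existing AKS cluster
    az aks enable-addons --addons monitoring \
      --name MyAKSCluster \
      --resource-group MyResourceGroup \
      --workspace-resource-id "/subscriptions/<sub-id>/resourcegroups/<rg>/providers/microsoft.operationalinsights/workspaces/<workspace>"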

Here you can find awesome documentation about Understanding AKS cluster performance with Azure Monitor for containers

What I really like is that you can now see the Container Live Logs from the Azure portal and watch what is going on inside a container 🙂

Activate Azure Kubernetes Container Live Logs

Here you see the Container Live logs

This feature provides a real-time view into your Azure Kubernetes Service (AKS) container logs (stdout/stderr) without having to run kubectl commands. When you select this option, a new pane appears below the container performance data table on the Containers view, showing the live logging generated by the container engine to further assist in troubleshooting issues in real time.
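For comparison, this is what you would otherwise run with kubectl to tail a container's stdout/stderr; the pod, container, and namespace names are placeholders:

    # Stream the logs of a pod in real time
    kubectl logs --follow mypod --namespace default

    # Stream the logs of a specific container in a multi-container pod
    kubectl logs --follow mypod -c mycontainer --namespace default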
The live logs feature supports three methods to control access to the logs:

  1. AKS without Kubernetes RBAC authorization enabled
  2. AKS enabled with Kubernetes RBAC authorization
  3. AKS enabled with Azure Active Directory (AD) SAML-based single sign-on

You can even search the Container Live Logs for troubleshooting and history:

Search on ssh

Azure Monitor for containers uses a containerized version of the Log Analytics agent for Linux. After initial deployment, there are routine or optional tasks you may need to perform during its lifecycle.
Thanks to this agent, you can work with Log Analytics in Azure Monitor:

Log Analytics on Containers.

Here you can find more about the Log Analytics query language
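As a sketch of what such a query looks like, here is the live-logs search on ssh from above expressed against the ContainerLog table, run through the Azure CLI; the workspace GUID is a placeholder:

    # Query the Log Analytics workspace for container log lines containing "ssh"
    az monitor log-analytics query \
      --workspace "00000000-0000-0000-0000-000000000000" \
      --analytics-query 'ContainerLog | where LogEntry contains "ssh" | take 10'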

Conclusion:

When you have production workloads running on Azure Kubernetes Service clusters, it's important to monitor them, so that you stay in control of the solution in Microsoft Azure and can spot improvements, such as better performance, for the business. With Container Live Logs you can see what is going on inside your containers when you have issues, which is great for troubleshooting and getting your problem solved fast. Get your workload into Azure containers and build your Azure DevOps CI/CD pipelines in the cloud.

Join the LinkedIn Community Groups for :

Containers in the Cloud

Azure DevOps Community

Microsoft Azure Monitor & Security for Hybrid IT



Take a Deep Dive with this SQL Server 2017 Administration Ebook #SQL #SQL2017 #Azure #dba

Introduction

The velocity of change for the Microsoft SQL Server DBA has increased this decade. The span between the releases of SQL Server 2016 and 2017 was only 16 months, the fastest new release ever. Gone are the days when DBAs had three to five years to soak in and adjust to new features in the engine and surrounding technologies.

This book is written and edited by SQL Server experts with two goals in mind: to deliver a solid foundational skillset for all of the topics covered in SQL Server configuration and administration, and also to deliver awareness and functional, practical knowledge for the dramatic number of new features introduced in SQL Server 2016 and 2017. We haven't avoided new content, even content that stretched the boundaries of writing deadlines with late-breaking new releases. You will be presented with not only the "how" of new features, but also the "why" and the "when" for their use.

Take a deep dive with this awesome SQL Server 2017 ebook 😉

Download the custom excerpt of the Inside Out SQL Server 2017 Administration ebook here



Getting started with #Microsoft Azure Cognitive Services in #Containers #Azure #AI #AKS #Docker

Microsoft Visual Studio Code Tools for AI

With container support, customers can use Azure’s intelligent Cognitive Services capabilities, wherever the data resides. This means customers can perform facial recognition, OCR, or text analytics operations without sending their content to the cloud. Their intelligent apps are portable and scale with greater consistency whether they run on the edge or in Azure.

Bringing AI to the Edge, via Eric Boyd, Corporate Vice President, Azure AI

Get started with these Azure Cognitive Services Containers

Building solutions with machine learning often requires a data scientist. Azure Cognitive Services enables organizations to take advantage of AI with their existing developers, without requiring a data scientist. Microsoft does this by taking the machine learning models, pipelines, and infrastructure needed to build a model and packaging them up into a Cognitive Service for vision, speech, search, text processing, language understanding, and more. This makes it possible for anyone who can write a program to use machine learning to improve an application. However, many enterprises still face challenges building large-scale AI systems. Today Microsoft announced container support for Cognitive Services, making it significantly easier for developers to build ML-driven solutions.

Microsoft provides the following containers:

  • Text Analytics Containers
  • Face Container
  • Recognize Text Container

More information from Lance Olson, Director of Program Management, Applied AI, here

Start with Installing and running Containers

Request access to the private container registry

You must first complete and submit the Cognitive Services Vision Containers Request form to request access to the Face container. The form requests information about you, your company, and the user scenario for which you’ll use the container. Once submitted, the Azure Cognitive Services team reviews the form to ensure that you meet the criteria for access to the private container registry.

Important!

You must use an email address associated with either a Microsoft Account (MSA) or Azure Active Directory (Azure AD) account in the form. If your request is approved, you then receive an email with instructions describing how to obtain your credentials and access the private container registry.

Read more about installing the Containers here
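Once you are approved, pulling the image follows the usual Docker workflow against the private registry. A hedged sketch; the registry host, repository path, and credentials below are placeholders based on the preview at the time of writing:

    # Sign in to the private Cognitive Services container registry
    docker login containerpreview.azurecr.io \
      --username <username> --password <password>

    # Pull the Face container image
    docker pull containerpreview.azurecr.io/microsoft/cognitive-services-face:latest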

The Face container uses a common configuration framework, so that you can easily configure and manage storage, logging and telemetry, and security settings for your containers.
Configuration settings
Configuration settings in the Face container are hierarchical, and all containers use a shared hierarchy, based on the following top-level structure:

  • ApiKey
  • ApplicationInsights
  • Authentication
  • Billing
  • CloudAI
  • Eula
  • Fluentd
  • Logging
  • Mounts
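
These settings can be passed on the command line when you start the container. A minimal sketch of running the Face container with the required Eula, Billing, and ApiKey settings; the image path, billing endpoint, and key are placeholders:

    # Run the Face container, accepting the EULA and supplying billing details
    docker run --rm -it -p 5000:5000 --memory 6g --cpus 2 \
      containerpreview.azurecr.io/microsoft/cognitive-services-face \
      Eula=accept \
      Billing=<your-billing-endpoint-uri> \
      ApiKey=<your-api-key>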

Read more here about Configuring the Containers

Follow Containers in the Cloud Community Group



via @MSAzureCAT Enterprise #Cloud Control Plane Planning #AzureDevOps #Pipelines

End-to-end Pipelines for Automating Microsoft Azure Deployments

Overview:

Imagine a fully automated, end-to-end pipeline for your cloud deployments—one that encompasses and automates everything:

  • Source code repos
  • The build and release iterations
  • Agile processes supported by continuous integration and continuous deployment (CI/CD)
  • Security and governance
  • Business unit chargebacks
  • Support and maintenance

Azure services and infrastructure-as-code (IaC) make control plane automation very achievable. Many enterprise IT groups dream of creating or unifying their disparate automation processes and supporting a common, enterprise-wide datacenter control plane in the cloud that is integrated with their existing or new DevOps workflows. Their development environments may use Jenkins, Azure DevOps Services (formerly Visual Studio Team Services), Visual Studio Team Foundation Server (TFS), Atlassian, or other services. The challenge is to automate beyond the CI/CD pipeline to the management and policy layers. From a planning and architecture standpoint, it can seem like an overwhelming program of interdependent systems and processes.

This guide outlines a planning process that you can use for automated support of your cloud deployments and DevOps workflows beyond the CI/CD pipeline. The Azure platform provides services you can use, or you can choose to work with third-party or open source options. The process is based on real-world examples that we have deployed with enterprise customers on Azure.
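As a taste of the IaC building block such a control plane automates, here is a minimal sketch of deploying an ARM template from the Azure CLI; the resource group, location, and template file names are placeholders:

    # Create a resource group to deploy into
    az group create --name MyControlPlaneRG --location westeurope

    # Deploy an ARM template with its parameter file (infrastructure as code)
    az group deployment create \
      --resource-group MyControlPlaneRG \
      --template-file azuredeploy.json \
      --parameters @azuredeploy.parameters.json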

This whitepaper was authored by Tim Ehlen, edited by Nanette Ray, and reviewed by AzureCAT.

Download the Awesome eBook here on the AzureCAT Team Blog

Follow AzureCAT and SQLCAT on Twitter



Using #Azure Pipelines for your Open Source Project #AzureDevOps

Azure Pipelines for your Open Source Projects

Damian speaks to Edward Thomson about how to get started with Azure Pipelines – right from GitHub. The deep integration and GitHub Marketplace app for Azure Pipelines makes it incredibly easy to build your projects no matter what language you’re using. You can even use the builds as part of your PR checks!

https://github.com/marketplace/azure-pipelines

Edward shows us the incredible (free!) offers for open and closed source projects, and walks through creating and running a new Azure Pipelines build from scratch in only a few minutes.

Subscribe to Azure DevOps on YouTube



Microsoft #Azure Service Fabric Mesh for your #Microservices and #Container Apps in the #Cloud

Microsoft Service Fabric Mesh

Azure Service Fabric Mesh is a fully managed service that enables developers to deploy microservices applications without managing virtual machines, storage, or networking. Applications hosted on Service Fabric Mesh run and scale without you worrying about the infrastructure powering them. Service Fabric Mesh consists of clusters of thousands of machines. All cluster operations are hidden from the developer. Simply upload your code and specify the resources you need, availability requirements, and resource limits. Service Fabric Mesh automatically allocates the infrastructure and handles infrastructure failures, making sure your applications are highly available. You only need to care about the health and responsiveness of your application, not the infrastructure.
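Deployment is driven by a template describing the application and its resources. A hedged sketch using the Azure CLI mesh extension (in preview at the time of writing); the resource group and template file names are placeholders:

    # Add the Service Fabric Mesh preview extension to the Azure CLI
    az extension add --name mesh

    # Create a resource group for the Mesh application
    az group create --name MyMeshRG --location eastus

    # Deploy the application described in the Mesh template
    az mesh deployment create \
      --resource-group MyMeshRG \
      --template-file mesh_app.json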

With Service Fabric Mesh you can:

  • “Lift and shift” existing applications into containers to modernize and run your current applications at scale.
  • Build and deploy new microservices applications at scale in Azure. Integrate with other Azure services or existing applications running in containers. Each microservice is part of a secure, network isolated application with resource governance policies defined for CPU cores, memory, disk space, and more.
  • Integrate with and extend existing applications without making changes to those applications. Use your own virtual network to connect existing applications to the new application.
  • Modernize your existing Cloud Services applications by migrating to Service Fabric Mesh.

Build high-availability into your application architecture by co-locating your compute, storage, networking, and data resources within a zone and replicating in other zones. Azure services that support Availability Zones fall into two categories:

  • Zonal services – you pin the resource to a specific zone (for example, virtual machines, managed disks, IP addresses)
  • Zone-redundant services – the platform replicates automatically across zones (for example, zone-redundant storage, SQL Database)

To achieve comprehensive business continuity on Azure, build your application architecture using the combination of Availability Zones with Azure region pairs. You can synchronously replicate your applications and data using Availability Zones within an Azure region for high-availability and asynchronously replicate across Azure regions for disaster recovery protection.

Store state in an Azure Service Fabric Mesh application by mounting an Azure Files based volume inside the container.

Twitter AMA on Service Fabric Mesh:

The Service Fabric team will be hosting an Ask Me Anything (AMA) session (more like "ask us anything"!) for Service Fabric Mesh on Twitter on Tuesday, October 30th, from 9am to 10:30am PST. Tweet to @servicefabric or @AzureSupport using #SFMeshAMA with your questions on Mesh and Service Fabric. More information here

More information about Azure Service Fabric Mesh:

Microsoft Azure Service Fabric Mesh LAB on Github

Get started with Microsoft Azure Service Fabric for your Microservices and Container Apps

Service Fabric Microsoft Ignite 2018 sessions

JOIN Containers in the Cloud Community Group on LinkedIn here



Make your first Pipeline with Azure DevOps Project in the #Cloud #Azure #AzureDevOps


Start your Azure DevOps Project in Azure here.

Microsoft Azure DevOps Services (Tools) to make your own CI/CD Pipeline in the Cloud

Azure Pipelines is a cloud service that you can use to automatically build and test your code project and make it available to other users. It works with just about any language or project type.
Pipelines combines both Continuous Integration (CI) and Continuous Deployment (CD) to constantly and consistently test and build your code and ship it to any target.

Microsoft made it really easy to create your first Azure DevOps pipeline in the cloud.
Here is a step-by-step guide to creating your first Azure pipeline:

If you have already built your cloud application, you can choose the Bring Your Own Code option 😉

But in this step-by-step guide, I chose an HTML5 Azure Web App template, which is available in Azure.

Static Azure Website => Next.

When you create your Azure DevOps project, you can see the flow of the creation steps.

For the web app service, there are two options in this deployment template:

  1. Web App for Containers
  2. Web App as a Service

Azure Web Apps enables you to build and host web applications in the programming language of your choice without managing infrastructure. It offers auto-scaling and high availability, supports both Windows and Linux, and enables automated deployments from GitHub, Azure DevOps, or any Git repo.

Web App for Containers provides built-in Docker images on Linux with support for specific versions, such as PHP 7.0 and Node.js 4.5. Web App for Containers uses the Docker container technology to host both built-in images and custom images as a platform as a service. In this tutorial, you learn how to build a custom Docker image and deploy it to Web App for Containers. This pattern is useful when the built-in images don’t include your language of choice, or when your application requires a specific configuration that isn’t provided within the built-in images.
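For reference, here is a minimal sketch of standing up a Web App for Containers with a custom image from the Azure CLI; the registry, plan, and app names are placeholders:

    # Create a resource group and a Linux App Service plan
    az group create --name MyWebAppRG --location westeurope
    az appservice plan create --name MyLinuxPlan \
      --resource-group MyWebAppRG --sku B1 --is-linux

    # Create the web app from a custom container image
    az webapp create --name my-container-webapp \
      --resource-group MyWebAppRG --plan MyLinuxPlan \
      --deployment-container-image-name myregistry.azurecr.io/mysite:v1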

The last step needs the following information:

  • Organization (for the site name)
  • Project name
  • Subscription ID
  • Web app name
  • Azure location

Then click Done.

Deployment overview.

Your Azure DevOps pipeline is running, as easy as that 🙂

But most importantly, your Azure Web App is running.

Running in your container in Azure.

Azure DevOps Container Web App Pipeline is running.

From here you can build your project and share it with your developer team.
You can find more information in the Azure DevOps Docs

Here are some snapshots of the latest Azure DevOps release features at the time I wrote this blog post:

To stay up to date on Microsoft Azure DevOps, here are some links:

Follow Microsoft Azure DevOps on Twitter

Start here free with Azure DevOps

Microsoft Azure DevOps Blog

JOIN the Azure DevOps Community Group on LinkedIn



#Microsoft SQL Server 2019 Preview Overview #SQL #SQL2019 #Linux #Containers #MSIgnite

Microsoft SQL Server 2019 Preview

What’s New in Microsoft SQL Server 2019 Preview

  • Big Data Clusters
      ◦ Deploy a big data cluster with SQL and Spark Linux containers on Kubernetes
      ◦ Access your big data from HDFS
      ◦ Run advanced analytics and machine learning with Spark
      ◦ Use Spark streaming to stream data to SQL data pools
      ◦ Use Azure Data Studio to run query books that provide a notebook experience

  • Database engine
      ◦ UTF-8 support
      ◦ Resumable online index create allows index creation to resume after interruption (see the sketch after this list)
      ◦ Clustered columnstore online index build and rebuild
      ◦ Always Encrypted with secure enclaves
      ◦ Intelligent query processing
      ◦ Java language programmability extension
      ◦ SQL Graph features
      ◦ Database scoped configuration setting for online and resumable DDL operations
      ◦ Always On Availability Groups – secondary replica connection redirection
      ◦ Data discovery and classification – natively built into SQL Server
      ◦ Expanded support for persistent memory devices
      ◦ Support for columnstore statistics in DBCC CLONEDATABASE
      ◦ New options added to sp_estimate_data_compression_savings
      ◦ SQL Server Machine Learning Services failover clusters
      ◦ Lightweight query profiling infrastructure enabled by default
      ◦ New PolyBase connectors
      ◦ New sys.dm_db_page_info system function returns page information

  • SQL Server on Linux
      ◦ Replication support
      ◦ Support for the Microsoft Distributed Transaction Coordinator (MSDTC)
      ◦ Always On Availability Groups on Docker containers with Kubernetes
      ◦ OpenLDAP support for third-party AD providers
      ◦ Machine Learning on Linux
      ◦ New container registry
      ◦ New RHEL-based container images
      ◦ Memory pressure notification

  • Master Data Services
      ◦ Silverlight controls replaced

  • Security
      ◦ Certificate management in SQL Server Configuration Manager

  • Tools
      ◦ SQL Server Management Studio (SSMS) 18.0 (preview)
      ◦ Azure Data Studio
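As promised above, here is a sketch of the resumable online index create feature in action, run through sqlcmd; the server, credentials, database, table, and column names are all placeholders:

    # Build an index online and make it resumable after an interruption
    sqlcmd -S localhost -U sa -P '<YourStrong!Passw0rd>' -d MyDatabase -Q "
      CREATE INDEX IX_Orders_CustomerId
      ON dbo.Orders (CustomerId)
      WITH (ONLINE = ON, RESUMABLE = ON, MAX_DURATION = 60 MINUTES);
    "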

Introducing Microsoft SQL Server 2019 Big Data Clusters

SQL Server 2019 big data clusters make it easier for big data sets to be joined to the dimensional data typically stored in the enterprise relational database, enabling people and apps that use SQL Server to query big data more easily. The value of the big data greatly increases when it is not just in the hands of the data scientists and big data engineers but is also included in reports, dashboards, and applications. At the same time, the data scientists can continue to use big data ecosystem tools while also utilizing easy, real-time access to the high-value data in SQL Server because it is all part of one integrated, complete system.

Read the complete Awesome blogpost from Travis Wright about SQL Server 2019 Big Data Cluster here

Starting in SQL Server 2017 with support for Linux and containers, Microsoft has been on a journey of platform and operating system choice. With SQL Server 2019 preview, we are making it easier to adopt SQL Server in containers by enabling new HA scenarios and adding supported Red Hat Enterprise Linux container images. Today we are happy to announce the availability of SQL Server 2019 preview Linux-based container images on Microsoft Container Registry, Red Hat-Certified Container Images, and the SQL Server operator for Kubernetes, which makes it easy to deploy an Availability Group.

SQL Server 2019 preview containers now available
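Trying the preview container is a one-liner with Docker. A hedged sketch; the image tag reflects the CTP 2.0 preview at the time of writing and may have changed since, and the SA password is a placeholder:

    # Run the SQL Server 2019 preview container from the Microsoft Container Registry
    docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=<YourStrong!Passw0rd>" \
      -p 1433:1433 --name sql2019 \
      -d mcr.microsoft.com/mssql/server:2019-CTP2.0-ubuntu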

Microsoft Azure Data Studio

Azure Data Studio is a new cross-platform desktop environment for data professionals using the family of on-premises and cloud data platforms on Windows, MacOS, and Linux. Previously released under the preview name SQL Operations Studio, Azure Data Studio offers a modern editor experience with lightning fast IntelliSense, code snippets, source control integration, and an integrated terminal. It is engineered with the data platform user in mind, with built-in charting of query resultsets and customizable dashboards.

Read the Complete Blogpost About Microsoft Azure Data Studio for SQL Server here

SQL Server 2019: Celebrating 25 years of SQL Server Database Engine and the path forward

Awesome work, Microsoft SQL Team, and congrats on your 25th anniversary!



Watch the Live Stream Today of #Microsoft Ignite 2018 in Orlando 24 – 28 September #MSIgnite #Azure #Cloud #DevOps and More


Don’t miss the Live Stream of Microsoft Ignite 2018

Get the latest insights and skills from technology leaders and practitioners shaping the future of cloud, data, business intelligence, teamwork, and productivity. Immerse yourself with the latest tools, tech, and experiences that matter, and hear the latest updates and ideas directly from the experts.

Watch live at https://www.microsoft.com/en-us/ignite as Microsoft CEO Satya Nadella lays out his vision for the future of tech, then watch other Microsoft leaders explore the most important tools and technologies coming in the next year. After the keynotes, select Microsoft Ignite sessions will stream live, so you can take a deep dive into the future of your profession.


More than 700 sessions and 100+ expert-led and self-paced workshops


#MSIgnite