mountainss Cloud and Datacenter Management Blog

Microsoft System Center blog site about virtualization on-premises and in the cloud



#Microsoft Windows Server Summit 2018 on June 26 #Winserv #WindowsAdminCenter #Containers #WindowsServerSummit

Microsoft Windows Admin Center

Join Microsoft Windows Server Summit on June 26, 2018

Join Microsoft on Tuesday, June 26, 2018 for a virtual experience to learn tips and tricks for modernizing your infrastructure and applications—regardless of whether you’re running Windows Server on-premises or in the cloud.

Here you will find more information from Microsoft about the Windows Server Summit 2018


Windows Admin Center Rocks

#MVPbuzz





#Microsoft Azure DevOps Projects and Infrastructure as Code #Azure #IaC #DevOps


Microsoft Azure DevOps Project for CI/CD

The Azure DevOps Project presents a simplified experience where you bring your existing code and Git repository, or choose from one of the sample applications to create a continuous integration (CI) and continuous delivery (CD) pipeline to Azure. The DevOps project automatically creates Azure resources such as a new Azure virtual machine, creates and configures a release pipeline in VSTS that includes a build definition for CI, sets up a release definition for CD, and then creates an Azure Application Insights resource for monitoring.

Infrastructure as Code (IaC) gives you benefits like:

  • Consistency in naming conventions of Azure components
  • Working in a consistent way that aligns with your company policies
  • Reusability of templates
  • Automatic documentation and a CMDB of deployments in your repository
  • Rapid deployments
  • Flexibility and scalability in code for Azure deployments
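The consistency benefit is easy to see in a small sketch. The naming convention below (environment-region-app-type) is a hypothetical example of my own, not an Azure rule:

```python
# Hypothetical naming-convention helper: a sketch of how Infrastructure as
# Code keeps Azure resource names consistent across every deployment.
# The env-region-app-type convention here is an assumption for illustration.

def resource_name(env: str, region: str, app: str, res_type: str) -> str:
    """Build a resource name like 'prd-weu-webshop-vm' from fixed parts."""
    parts = [env, region, app, res_type]
    if not all(parts):
        raise ValueError("every naming component is required")
    return "-".join(p.lower() for p in parts)

# The same code produces the same names for everyone, every time:
print(resource_name("Prd", "WEU", "WebShop", "vm"))    # prd-weu-webshop-vm
print(resource_name("Tst", "WEU", "WebShop", "vnet"))  # tst-weu-webshop-vnet
```

Because the names come from code instead of clicking and typing in the Azure Portal, every deployment gets the same names in the same way.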

As a large enterprise you don’t want many employees clicking and typing in the Azure Portal to get the job done, because it’s hard to stay consistent that way. Changes and deployments will drift over time, because people make mistakes. For developers it’s important to have a build process before publishing an application, so why shouldn’t DevOps engineers and IT pros do the same thing for infrastructure?

In the following step-by-step guide you will learn how to create a Microsoft Azure DevOps Project and build a CI/CD pipeline that deploys a virtual machine running your ASP.NET application.

Prerequisites:
An Azure subscription. You can get one free through Visual Studio Dev Essentials.
Access to a GitHub or external Git repository that contains .NET, Java, PHP, Node, Python, or static web code.

Here you will find the GitHub Developer Guide

When you have your prerequisites in place you can start with the following steps:

Search for DevOps at All Services in the Azure Portal

Select .NET and click Next

You can see where you are in the flow of creating your CI/CD pipeline. If you need an Azure SQL Database for your ASP.NET application, you can select the Add a Database option, which provides Azure SQL as a service (PaaS).

Database-as-a-Service
(I didn’t choose SQL)


In this step select Virtual Machine and click Next

From here you can create a new Visual Studio Team Services (VSTS) account or use your existing one. After selecting VSTS you can manage your Azure settings; by clicking Change you can select the Azure options.

 

Select the Virtual Machine you need for your Application.

Here you see the Deployment Running

Important for Infrastructure as Code (IaC): the deployment template can be saved into the library and/or downloaded for reuse, and you can build your own policies into the template.

When you save it into the Azure library you can record the release notes and the publisher

In the Microsoft Azure DevOps Project main dashboard you will see the status of your CI/CD pipeline and whether a release is in progress. On the right side of the dashboard you see the Azure resources, such as the application endpoint, the virtual machine, and Application Insights for monitoring. When the CI/CD pipeline deployment has succeeded you can browse to your ASP.NET application.

Your Application.

Your virtual machine running and being monitored.


The Microsoft Azure DevOps Project CI/CD Pipeline is Completed.

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your live web application. It will automatically detect performance anomalies. It includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It’s designed to help you continuously improve performance and usability. It works for apps on a wide variety of platforms including .NET, Node.js and J2EE, hosted on-premises or in the cloud. It integrates with your DevOps process, and has connection points to a variety of development tools. It can monitor and analyze telemetry from mobile apps by integrating with Visual Studio App Center and HockeyApp.

You can drill down into the error to see what is happening.

Azure Application Insights topology

Application Insights is aimed at the development team, to help you understand how your app is performing and how it’s being used. It monitors:

  • Request rates, response times, and failure rates – Find out which pages are most popular, at what times of day, and where your users are. See which pages perform best. If your response times and failure rates go high when there are more requests, then perhaps you have a resourcing problem.
  • Dependency rates, response times, and failure rates – Find out whether external services are slowing you down.
  • Exceptions – Analyse the aggregated statistics, or pick specific instances and drill into the stack trace and related requests. Both server and browser exceptions are reported.
  • Page views and load performance – reported by your users’ browsers.
  • AJAX calls from web pages – rates, response times, and failure rates.
  • User and session counts.
  • Performance counters from your Windows or Linux server machines, such as CPU, memory, and network usage.
  • Host diagnostics from Docker or Azure.
  • Diagnostic trace logs from your app – so that you can correlate trace events with requests.
  • Custom events and metrics that you write yourself in the client or server code, to track business events such as items sold or games won.

You can also drill down into Microsoft Azure Log Analytics and run analytics queries to get the information you need for troubleshooting. More information on Azure Log Analytics and queries is on MSFT docs.

From Application Insights we see it was an exception error

Because the Azure DevOps Project is connected with VSTS you can follow the build and release here too, and you get documentation of the CI/CD pipeline.

From here you can work with your developers and DevOps engineers and manage the users and groups security in the CI/CD pipeline for the next build. Working together to build innovative apps via VSTS from one dashboard:

VSTS Dashboard

The next day you can see it was a one-time error and the pipeline is running fine 😉

For more information about all the possibilities with Microsoft Azure DevOps Project go to MSFT Docs

DevOps and Microsoft :

DevOps is the union of people, process, and products to enable continuous delivery of value to our end users.

To Learn DevOps please visit this Microsoft DevOps Site

Conclusion:

Investing in your CI/CD pipeline and building your own environment before you deploy into Azure production is important for your business. Keep your ARM templates and code in repositories like Git or VSTS. When you have all this in place you are more in control of consistent deployments and changes in the Azure cloud. I hope this blog post is useful for you and your company. Start today with Infrastructure as Code (IaC) and get the benefits 😉



#Microsoft Azure Storage Tools AzCopy and #Azure Storage Explorer #Cloud #AzureStack

Microsoft Azure Storage tools
Type azcopy /? for help on AzCopy.
C:\Program Files (x86)\Microsoft SDKs\Azure>azcopy /?
——————————————————————————
AzCopy 7.1.0 Copyright (c) 2017 Microsoft Corp. All Rights Reserved.
——————————————————————————

AzCopy </Source:> </Dest:> [/SourceKey:] [/DestKey:] [/SourceSAS:] [/DestSAS:]
[/V:] [/Z:] [/@:] [/Y] [/NC:] [/SourceType:] [/DestType:] [/S]
[/Pattern:] [/CheckMD5] [/L] [/MT] [/XN] [/XO] [/A] [/IA] [/XA]
[/SyncCopy] [/SetContentType] [/BlobType:] [/Delimiter:] [/Snapshot]
[/PKRS:] [/SplitSize:] [/EntityOperation:] [/Manifest:]
[/PayloadFormat:]
##
## Common Options ##
##
/Source:<source> Specifies the source data from which to copy.
The source can be a directory including:
a file system directory, a blob container,
a blob virtual directory, a storage file share,
a storage file directory, or an Azure table.
The source can also be a single file including:
a file system file, a blob or a storage file.
The source is interpreted according to following rules:
1) When either file pattern option /Pattern or
recursive mode option /S is specified,
the source will be interpreted to a directory.
2) When both file pattern option /Pattern and
recursive mode option /S are not specified,
the source can be a single file or a directory.
In this case, AzCopy will choose an existing
location as the source, if the source is both
an existing file and an existing directory,
the source will be interpreted to a single file.

/Dest:<destination> Specifies the destination to copy to.
The destination can be a directory including:
a file system directory, a blob container,
a blob virtual directory, a storage file share,
a storage file directory, or an Azure table.
The destination can also be a single file including:
a file system file, a blob or a storage file.
The destination is interpreted according to following rules:
1) When source is a single file, destination
is interpreted as a single file.
2) When source is a directory, destination
is interpreted as a directory.

/SourceKey:<storage-key> Specifies the storage account key for the
source resource.

/DestKey:<storage-key> Specifies the storage account key for the
destination resource.

/SourceSAS:<SAS-Token> Specifies a Shared Access Signature with READ
and LIST permissions for the source (if
applicable). Surround the SAS with double
quotes, as it may contain special command-line
characters.
The SAS must be a Container/Share/Table SAS, or
an Account SAS with ResourceType that includes
Container.
If the source resource is a blob container,
and neither a key nor a SAS is provided, then
the blob container will be read via anonymous
access.
If the source is a file share or table, a key or
a SAS must be provided.

/DestSAS:<SAS-Token> Specifies a Shared Access Signature (SAS) with
READ and WRITE permissions for the
destination (if applicable). When /Y is
specified, and /XO /XN are not specified, the SAS
can have only WRITE permission for the operation
to succeed.
Surround the SAS with double quotes, as it may
contain special command-line characters.
The SAS must be a Container/Share/Table SAS, or
an Account SAS with ResourceType that includes
Container.
If the destination resource is a blob container,
file share or table, you can either specify this
option followed by the SAS token, or you can
specify the SAS as part of the destination blob
container, file share or table’s URI, without
this option.
This option is not supported when asynchronously
copying between two different types of storage
service or between two different accounts.

/V:[verbose-log-file] Outputs verbose status messages into a log
file.
By default, the verbose log file is named
AzCopyVerbose.log in
%LocalAppData%\Microsoft\Azure\AzCopy. If you
specify an existing file location for this
option, the verbose log will be appended to
that file.

/Z:[journal-file-folder] Specifies a journal file folder for resuming an
operation.
AzCopy always supports resuming if an
operation has been interrupted.
If this option is not specified, or it is
specified without a folder path, then AzCopy
will create the journal file in the default
location, which is
%LocalAppData%\Microsoft\Azure\AzCopy.
Each time you issue a command to AzCopy, it
checks whether a journal file exists in the
default folder, or whether it exists in a
folder that you specified via this option. If
the journal file does not exist in either
place, AzCopy treats the operation as new and
generates a new journal file.
If the journal file does exist, AzCopy will
check whether the command line that you input
matches the command line in the journal file.
If the two command lines match, AzCopy resumes
the incomplete operation. If they do not match,
you will be prompted to either overwrite the
journal file to start a new operation, or to
cancel the current operation.
The journal file is deleted upon successful
completion of the operation.
Note that resuming an operation from a journal
file created by a previous version of AzCopy
is not supported.

/@:<parameter-file> Specifies a file that contains parameters.
AzCopy processes the parameters in the file
just as if they had been specified on the
command line.
In a response file, you can either specify
multiple parameters on a single line, or
specify each parameter on its own line. Note
that an individual parameter cannot span
multiple lines.
Response files can include comments lines that
begin with the # symbol.
You can specify multiple response files.
However, note that AzCopy does not support
nested response files.

/Y Suppresses all AzCopy confirmation prompts.

/NC:<number-of-concurrent> Specifies the number of concurrent operations.
AzCopy by default starts a certain number of
concurrent operations to increase the data
transfer throughput.
Note that large number of concurrent operations
in a low-bandwidth environment may overwhelm
the network connection and prevent the
operations from fully completing. Throttle
concurrent operations based on actual available
network bandwidth.
The upper limit for concurrent operations is
512.
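To get a feel for picking a /NC value, here is a hedged Python sketch. The "8 × number of cores" starting point is my own assumption for illustration; only the upper limit of 512 comes from the help text above:

```python
import os

# Hedged sketch of choosing a concurrency value for /NC. The 8-per-core
# heuristic is an assumption, not documented AzCopy behavior; the 512 cap
# is the documented upper limit.
def pick_concurrency(max_ops: int = 512) -> int:
    cores = os.cpu_count() or 1
    return min(max_ops, 8 * cores)

nc = pick_concurrency()
print(f"azcopy ... /NC:{nc}")  # pass the chosen value via the /NC option
```

In a low-bandwidth environment you would pass a smaller `max_ops`, per the throttling advice above.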

##
## Options – Applicable for Blob and Table Service Operations ##
##

/SourceType:<blob | table> Specifies that the source resource is a blob
or table available in the local development
environment, running in the storage emulator.

/DestType:<blob | table> Specifies that the destination resource is a
blob or table available in the local
development environment, running in the
storage emulator.

##
## Options – Applicable for Blob and File Service Operations ##
##

/S Specifies recursive mode for copy operations.
The /S parameter is only valid when the
source is a directory.
In recursive mode, AzCopy will copy all blobs
or files that match the specified file
pattern, including those in subfolders.

/Pattern:<file-pattern> Specifies a file pattern that indicates which
files to copy.
The behavior of the /Pattern parameter is
determined by the location of the source data,
and the presence of the recursive mode option.
The /Pattern parameter is only valid when the
source is a directory.
Recursive mode is specified via option /S.

If the specified source is a directory in
the file system, then standard wildcards are
in effect, and the file pattern provided is
matched against files within the directory.
If option /S is specified, then AzCopy also
matches the specified pattern against all
files in any subfolders beneath the directory.

If the specified source is a blob container or
virtual directory, then wildcards are not
applied. If option /S is specified, then AzCopy
interprets the specified file pattern as a blob
prefix. If option /S is not specified, then
AzCopy matches the file pattern against exact
blob names.
If the specified source is an Azure file share,
then you must either specify the exact file
name, (e.g. abc.txt) to copy a single file, or
specify option /S to copy all files in the
share recursively. Attempting to specify both a
file pattern and option /S together will result
in an error.

AzCopy uses case-sensitive matching when the
/Source is a blob, blob container or blob virtual
directory, and uses case-insensitive matching
in all the other cases.

The default file pattern used when no file
pattern is specified is *.* for a file system
location or an empty prefix for an Azure
Storage location.
Specifying multiple file patterns is not
supported.

/CheckMD5 Calculates an MD5 hash for downloaded data and
verifies that the MD5 hash stored in the blob
or file’s Content-MD5 property matches the
calculated hash. The MD5 check is turned off by
default, so you must specify this option to
perform the MD5 check when downloading data.
Note that Azure Storage doesn’t guarantee that
the MD5 hash stored for the blob or file is
up-to-date. It is the client’s responsibility to
update the MD5 whenever the blob or file is
modified.
AzCopy always sets the Content-MD5 property for
an Azure blob or file after uploading it to the
service.

/L Specifies a listing operation only; no data is
copied.
AzCopy will interpret the use of this option as
a simulation of running the command line without
option /L and count how many objects would be
copied; you can specify option /V at the same
time to check which objects will be copied in
the verbose log.
The behavior of this option is also determined by
the location of the source data and the presence
of the recursive mode option /S and file pattern
option /Pattern.
When using this option, AzCopy requires LIST and READ
permission of the source location if source is a directory,
or READ permission of the source location if source
is a single file.

/MT Sets the downloaded file’s last-modified time
to be the same as the source blob or file’s.

/XN Excludes a newer source resource. The resource
will not be copied if the source is the same
or newer than destination.

/XO Excludes an older source resource. The resource
will not be copied if the source resource is the
same or older than destination.

/A Uploads only files that have the Archive
attribute set.

/IA:[RASHCNETOI] Uploads only files that have any of the
specified attributes set.
Available attributes include:
R Read-only files
A Files ready for archiving
S System files
H Hidden files
C Compressed file
N Normal files
E Encrypted files
T Temporary files
O Offline files
I Not content indexed Files

/XA:[RASHCNETOI] Excludes files from upload that have any of the
specified attributes set.
Available attributes include:
R Read-only files
A Files ready for archiving
S System files
H Hidden files
C Compressed file
N Normal files
E Encrypted files
T Temporary files
O Offline files
I Not content indexed Files

/SyncCopy Indicates whether to synchronously copy blobs
or files among two Azure Storage end points.
AzCopy by default uses server-side
asynchronous copy. Specify this option to
download the blobs or files from the service
to local memory and then upload them to the
service.
/SyncCopy can be used in below scenarios:
1) Copying from Blob storage to Blob storage.
2) Copying from File storage to File storage.
3) Copying from Blob storage to File storage.
4) Copying from File storage to Blob storage.

/SetContentType:[content-
type] Specifies the content type of the destination
blobs or files.
AzCopy by default uses
“application/octet-stream” as the content type
for the destination blobs or files. If option
/SetContentType is specified without a value
for “content-type”, then AzCopy will set each
blob or file’s content type according to its
file extension. To set same content type for
all the blobs, you must explicitly specify a
value for “content-type”.
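As an illustration of what "content type according to its file extension" means, here is a Python sketch using the standard mimetypes module. This is not AzCopy's actual lookup, just the same idea:

```python
import mimetypes

# Illustration only: map a file extension to a content type, falling back
# to AzCopy's documented default when the extension is unknown.
def guess_content_type(filename: str) -> str:
    ctype, _encoding = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"

print(guess_content_type("report.html"))  # text/html
print(guess_content_type("somefile"))     # application/octet-stream
```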

##
## Options – Only applicable for Blob Service Operations ##
##

/BlobType:<page | block
| append> Specifies whether the destination blob is a
block blob, a page blob or an append blob.
If the destination is a blob and this option
is not specified, then by default AzCopy will
create a block blob.

/Delimiter:<delimiter> Indicates the delimiter character used to
delimit virtual directories in a blob name.
By default, AzCopy uses / as the delimiter
character. However, AzCopy supports using any
common character (such as @, #, or %) as a
delimiter. If you need to include one of these
special characters on the command line, enclose
it with double quotes.
This option is only applicable for downloading
from an Azure blob container or virtual directory.

/Snapshot Indicates whether to transfer snapshots. This
option is only valid when the source is a
blob container or blob virtual directory.
The transferred blob snapshots are renamed in
this format: [blob-name] (snapshot-time)
[extension].
By default, snapshots are not copied.

##
## Options – only applicable for Table Service Operations ##
##

/PKRS:<“key1#key2#key3#…”> Splits the partition key range to enable
exporting table data in parallel, which
increases the speed of the export operation.
If this option is not specified, then AzCopy
uses a single thread to export table entities.
For example, if the user specifies
/PKRS:”aa#bb”, then AzCopy starts three
concurrent operations.
Each operation exports one of three partition
key ranges, as shown below:
[<first partition key>, aa)
[aa, bb)
[bb, <last partition key>]
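The split described above is easy to sketch. This Python snippet mimics the documented behavior of /PKRS:"aa#bb" (it is not AzCopy's real code):

```python
# Sketch of the /PKRS split: a key-range string such as "aa#bb" divides the
# partition key space into N+1 ranges that can be exported in parallel.
# None stands in for the first/last partition key.

def partition_ranges(pkrs: str):
    keys = pkrs.split("#")
    bounds = [None] + keys + [None]
    return list(zip(bounds[:-1], bounds[1:]))

for lo, hi in partition_ranges("aa#bb"):
    print(lo, hi)
# (None, aa) -> [<first partition key>, aa)
# (aa, bb)   -> [aa, bb)
# (bb, None) -> [bb, <last partition key>]
```

Three ranges means three concurrent export operations, exactly as the help text describes.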

/SplitSize:<file-size> Specifies the exported file split size in MB.
If this option is not specified, AzCopy will
export table data to single file.
If the table data is exported to a blob, and
the exported file size reaches the 200 GB limit
for blob size, then AzCopy will split the
exported file, even if this option is not
specified.

/EntityOperation:<InsertOrSkip
| InsertOrMerge
| InsertOrReplace> Specifies the table data import behavior.
InsertOrSkip – Skips an existing entity or
inserts a new entity if it does not exist in
the table.
InsertOrMerge – Merges an existing entity or
inserts a new entity if it does not exist in
the table.
InsertOrReplace – Replaces an existing entity
or inserts a new entity if it does not exist
in the table.

/Manifest:<manifest-file> Specifies the manifest file name for the table
export and import operation.
This option is optional during the export
operation, AzCopy will generate a manifest file
with predefined name if this option is not
specified.
This option is required during the import
operation for locating the data files.

/PayloadFormat:<JSON | CSV> Specifies the format of the exported data file.
If this option is not specified, by default
AzCopy exports data file in JSON format.

##
## Samples ##
##

#1 – Download a blob from Blob storage to the file system, for example,
download ‘https://myaccount.blob.core.windows.net/mycontainer/abc.txt’
to ‘D:\test\’
a) Use directory transfer if you have READ and LIST permission of the source data:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/
/Dest:D:\test\ /SourceKey:key /Pattern:”abc.txt”
b) Use single file transfer if you have READ permission of the source data:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/abc.txt
/Dest:D:\test\abc.txt /SourceSAS:”<SourceSASWithReadPermission>”

#2 – Copy a blob within a storage account
a) Use directory transfer if you have READ and LIST permission of the source data:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1/
/Dest:https://myaccount.blob.core.windows.net/mycontainer2/
/SourceKey:key /DestKey:key /Pattern:”abc.txt”
b) Use single file transfer if you have READ permission of the source data:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1/abc.txt
/Dest:https://myaccount.blob.core.windows.net/mycontainer2/abc.txt
/SourceSAS:”<SourceSASWithReadPermission>” /DestKey:key

#3 – Upload files and subfolders in a directory to a container, recursively
AzCopy /Source:D:\test\
/Dest:https://myaccount.blob.core.windows.net/mycontainer/
/DestKey:key /S

#4 – Upload files matching the specified file pattern to a container,
recursively.
AzCopy /Source:D:\test\
/Dest:https://myaccount.blob.core.windows.net/mycontainer/ /DestKey:key
/Pattern:*ab* /S

#5 – Download blobs with the specified prefix to the file system, recursively
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/
/Dest:D:\test\ /SourceKey:key /Pattern:”a” /S

#6 – Download files and subfolders in an Azure file share to the file system,
recursively
AzCopy /Source:https://myaccount.file.core.windows.net/mycontainer/
/Dest:D:\test\ /SourceKey:key /S

#7 – Upload files and subfolders from the file system to an Azure file share,
recursively
AzCopy /Source:D:\test\
/Dest:https://myaccount.file.core.windows.net/mycontainer/
/DestKey:key /S

#8 – Export an Azure table to a local folder
AzCopy /Source:https://myaccount.table.core.windows.net/myTable/
/Dest:D:\test\ /SourceKey:key

#9 – Export an Azure table to a blob container
AzCopy /Source:https://myaccount.table.core.windows.net/myTable/
/Dest:https://myaccount.blob.core.windows.net/mycontainer/
/SourceKey:key1 /Destkey:key2

#10 – Import data in a local folder to a new table
AzCopy /Source:D:\test\
/Dest:https://myaccount.table.core.windows.net/mytable1/ /DestKey:key
/Manifest:”myaccount_mytable_20140103T112020.manifest”
/EntityOperation:InsertOrReplace

#11 – Import data in a blob container to an existing table
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/
/Dest:https://myaccount.table.core.windows.net/mytable/ /SourceKey:key1
/DestKey:key2 /Manifest:”myaccount_mytable_20140103T112020.manifest”
/EntityOperation:InsertOrMerge

#12 – Synchronously copy blobs between two Azure Storage endpoints
AzCopy /Source:https://myaccount1.blob.core.windows.net/mycontainer/
/Dest:https://myaccount2.blob.core.windows.net/mycontainer/
/SourceKey:key1 /DestKey:key2 /Pattern:ab /SyncCopy

——————————————————————————
Learn more about AzCopy and Download at
http://aka.ms/azcopy.
——————————————————————————

 

Download Microsoft Azure Storage Explorer here



Impressions of Microsoft #MSTechSummit in Amsterdam 2018 #MVPbuzz

Microsoft Tech Summit 2018 Amsterdam

It was really awesome to help Microsoft at the #MSTechSummit in Amsterdam, doing Q&A for the community at the Microsoft Experts Center booth and talking with customers about real scenarios for moving to the Microsoft Azure cloud. Questions like: what are the best practices, what can I do with Microsoft Azure Stack in my own datacenter, and where can I get more information? I solved problems for customers by pointing them to where they can find the solution, and supported them with the on-demand labs by answering their questions. It’s just great to be a Microsoft MVP for Cloud and Datacenter Management and support the community in this way at the Microsoft Tech Summit 2018 in Amsterdam 🙂

Here you see some impressions of the two-day event:

The Entrance in Amsterdam RAI on the Day before the Event

Getting registered as a Speaker on the Day before the MSTechSummit begins.

The Azure Keynote with Tad Brockway

Impressive Virtual Machine on Azure Cloud Services

Supporting the Community on the Experts Booth doing Q&A

And of course you can meet Great Microsoft employees from Redmond 🙂

In the picture with Seth Juarez. He likes machine learning and AI,
and of course works on CH9
 

And on the Picture with Jeff Woolsey from the Microsoft Server Team.
Install Project Honolulu for Remote Management 😉

And YES you can do Clustering on Microsoft Azure !
Have a look at Robert Smit’s blog site

Meeting MVP mate from Austria Toni Pohl
He is developing cloud solutions with Office365 and Azure

The HUB

A full House for the Break-Out Session Azure Stack with Natalia Mackevicius

She is Director PM Azure Stack

Community Center and Experts Booth
Join the Microsoft Tech Community Today
#MVPbuzz

Microsoft LABS on Demand are Ready to Rock !

Thank you Microsoft and Community for this Awesome Event !
Microsoft Tech Summit 2018 Amsterdam



Microsoft #Azure DevTest LAB is Great for #Education and #DevOps

Azure DevTest Labs can be used to implement many key scenarios in addition to dev/test. One of those scenarios is to set up a lab for training. Azure DevTest Labs allows you to create a lab where you can provide custom templates that each trainee can use to create identical and isolated environments for training. You can apply policies to ensure that training environments are available to each trainee only when they need them and contain enough resources – such as virtual machines – required for the training. Finally, you can easily share the lab with trainees, which they can access in one click.

Creating your own DevTest Lab in your Microsoft Azure subscription is easy:

Select Developer tools and then DevTest Labs

Give your DevTest Lab a name and resource group

I already have it installed with some virtual machines.

When you go to Configuration and Policies you can configure your DevTest LAB for your Users.

From here you can manage and Configure your DevTest LAB.

Costs per Resource and who is the Owner

You can give your DevTest Lab users full control over virtual machine sizes, but then you have to watch your costs.
To stay in control you can decide which VM sizes can be selected by the DevTest Lab users, from a small Standard A2 VM
to a powerful GS5 virtual machine.

Then you can select how many virtual machines can be created by each DevTest Lab user, and how many virtual machines can be added to the complete DevTest Lab:

Virtual Machines Per User

Virtual Machines per LAB

Here you can make a DevTest LAB Announcement for the Users.

To keep your costs under control efficiently, you can auto-shutdown the complete lab and start it again on a schedule.
Then you don’t pay for compute when it’s not in use, which keeps your total costs low.
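A quick back-of-the-envelope calculation shows why the auto-shutdown scheduler matters. The schedule (10 hours per weekday) and the hourly VM rate below are made-up numbers for illustration, not real Azure prices:

```python
# Back-of-the-envelope cost arithmetic for auto-shutdown. All numbers are
# assumptions for illustration only.

HOURS_PER_WEEK = 24 * 7  # 168

def weekly_compute_cost(hourly_rate: float, hours_on: float) -> float:
    """Compute cost for one week given an hourly rate and hours powered on."""
    return hourly_rate * hours_on

always_on = weekly_compute_cost(0.20, HOURS_PER_WEEK)  # running 24x7
scheduled = weekly_compute_cost(0.20, 10 * 5)          # 10h on 5 weekdays

print(f"always on : {always_on:.2f}")  # 33.60
print(f"scheduled : {scheduled:.2f}")  # 10.00
print(f"saving    : {100 * (1 - scheduled / always_on):.0f}%")  # 70%
```

Even with made-up numbers, shutting the lab down outside working hours cuts the compute bill by roughly two thirds.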

Auto Start

Auto Shutdown

Important for your Azure DevTest Lab are the images from the Marketplace, but you can also upload your own custom images:

Azure Market Place

Of course you can also add repositories to your Azure DevTest Lab:

When you have all this installed, you can configure Identity and Access Management for your Azure DevTest Lab users.

I gave Student01 the DevTest Lab User role.

When you log in as the Azure DevTest Lab user you see your resources and the lab.

In the Activity Log you can see what is happening in your LAB.

For teachers in education, Microsoft Azure DevTest Labs is a great solution for working with IT students as they develop their own projects for school.

Here you see how easily you can roll out a Kali Linux virtual machine in your Azure DevTest Lab:

Select the Kali-Linux Image

Select your Virtual Machine Settings

Here you can select artifacts for your VM

You can download your JSON ARM Template here

Your Kali Linux VM is being created in your Azure DevTest Lab.

I like Microsoft Azure DevTest Labs a lot, and I hope you will too 🙂

More information about Microsoft Azure DevTest LAB is here on Docs



Build a Company in #Azure Video

Content
– The Azure Portal 00:05:00
– Networking in Azure 00:10:12
– Azure Virtual Machines 00:22:16
– Containers and Kubernetes Orchestration 00:50:57
– Directory Services and Azure AD 01:03:39
– DevTest Labs 01:18:23
– Backup and Disaster Recovery 01:29:48
– WebApps 01:37:15
– Automating Social Media 01:55:05
– Bots and Cognitive Service APIs 02:11:44
– Securing the Azure Cloud 02:23:45

Thanks to Daniel Baker 😉

Azure Citadel site



Infrastructure as a Service (IaaS) with Microsoft #Azure #Cloud #AzureStack #HybridCloud

A breakdown video of the essentials needed to plan and implement your solutions on Microsoft Azure IaaS. This 7-minute intro covers compute, virtual machines, containers, networking, storage, and management options in Microsoft Azure.

When you transform your on-premises datacenter to Microsoft Azure cloud services, these architecture references can help you
make the right choices for your business needs. The Azure Architecture Center contains guidance for building end-to-end solutions on Microsoft Azure. Here you will find reference architectures, best practices, design patterns, scenario guides, and reference implementations.

Start here for your Microsoft Azure Architecture designs

Microsoft Azure Architecture Center

On the left side of this page you can download the complete content of the Microsoft Azure Architecture Center as a PDF file 😉
It looks like this:

When your transition plan and architecture are done on paper, you can move safely to Microsoft Azure cloud services.

Accelerate your digital transformation:
Now is the time to move to Azure and reap the rewards of cloud technology, including the ability to scale up or down quickly, pay only for what you use, and save on compute power. Whether you are deploying new virtual machines, moving a few workloads, or migrating your datacenters as part of your hybrid cloud strategy, the Azure Hybrid Benefit provides big savings as you move to the cloud.

Have a look at the Microsoft Azure Hybrid Use Benefit

Here you find some handy links to Microsoft Azure Cloud Services :

Microsoft Azure Products Technical docs

Microsoft Azure SDK and Tools

Getting started with Microsoft Azure products

Microsoft Azure Resources

Here you will find the whitepaper Azure Virtual Datacenter Lift and Shift Guide, and also an e-book, Azure Virtual Datacenter, from the Azure CAT guidance team, which can help you start the transition of your datacenter to the Microsoft Azure cloud.

 

Microsoft Mechanics all Azure

When you have workloads in your on-premises datacenter which may not run in any public cloud or over the internet, you can run Microsoft Azure in your own datacenter via Microsoft Azure Stack.

Build modern applications across hybrid cloud environments

Azure Stack is an extension of Azure, bringing the agility and fast-paced innovation of cloud computing to on-premises environments. Only Azure Stack lets you deliver Azure services from your organization’s datacenter, while balancing the right amount of flexibility and control—for truly-consistent hybrid cloud deployments.

Microsoft Azure Stack Overview

I hope this blog post will help you on your journey to the Microsoft Azure Cloud.