At Coders Cafe: PowerShell – Introduction to SQL Server Containers

I’ll be presenting at the South Florida .NET User Group Coders Cafe on Tuesday, August 8th, 6:30 PM. Location: Cendyn Spaces, Boca Raton.

Topic: PowerShell – Introduction to SQL Server Containers

Description: This session will cover the basics of working with containers and PowerShell Core. We’ll walk through the steps of creating a SQL Server 2019 container on an Ubuntu 18.04 Linux system. Then, we’ll use PowerShell Core to connect to the SQL Server containers and extract information.

  

If you’re interested in attending this session, click here to register.

Using Linux dpkg packager to install PowerShell 7 Preview in Ubuntu 18.04

Just another way to install PowerShell Preview besides using “apt” or “snap”. With this approach, you don’t need to register the package repository.

Get the Preview link

First, look under the release documentation and search for the deb package. In my case, I’m installing the amd64 version.

Then, right-click on the “powershell-preview_7.0.0-preview.2-1.ubuntu.18.04_amd64.deb”, and select “Copy link address“.

This will copy the following link address:

https://github.com/PowerShell/PowerShell/releases/download/v7.0.0-preview.2/powershell-preview_7.0.0-preview.2-1.ubuntu.18.04_amd64.deb

Download the Preview

Now, I go back to my Linux machine, open a terminal session, and make sure to change directory to the “Downloads” folder.

cd Downloads

Then, I type the following command with the link address:

wget https://github.com/PowerShell/PowerShell/releases/download/v7.0.0-preview.2/powershell-preview_7.0.0-preview.2-1.ubuntu.18.04_amd64.deb

Installing the Preview

Now, I’m ready to install the preview using the dpkg package installer by executing the following command:

sudo dpkg -i powershell-preview_7.0.0-preview.2-1.ubuntu.18.04_amd64.deb

Now, we can start working with PowerShell.
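For example, to confirm the installation, you can launch the preview (the preview .deb package registers the pwsh-preview command) and check the version from inside the session:

pwsh-preview

$PSVersionTable.PSVersion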

In Summary

You can pick and choose the best way to install PowerShell. It really takes just a few lines to get the PowerShell Preview installed quickly.

Reference

Keep learning more PowerShell!

PowerShell Core – Working with Persistent Disk Storage in Docker Containers

This quick blog post hopes to give you a heads-up on how to work with container disk data. It’s a known fact that data stored inside a container will not persist if the container is removed. Yes! If you build a container to store your data, that data will be gone.

Containers are perfectly suited for testing, meant for fast deployment of a solution, and can be easily deployed to the cloud. It’s cost effective!

Very important to understand! Container disk data only exists as long as the container exists. If the container is removed, that data is gone.

So, you have to find a way to properly configure your container environment to make the data persist on disk.

Persisting Data

There are *two quick ways to persist data when working with containers:

1. Create a docker volume.
2. Or, use a local machine folder area.

*Note: There are other solutions to help with persisting data for containers, but this is a good starting point.

I’m using the Docker command line for now. Later, I will be creating some blog posts about using Docker Compose and Kubernetes.

I love to use PowerShell Core with Docker command line!

Docker Create Volume

Using the docker command “docker volume create <nameofvolume>” will create a volume to help persist data on your local machine.

docker volume create MyLinuxData

Use the following docker commands to check your newly created volume:

* To list all existing docker volume(s):

docker volume ls

* To inspect a docker volume and get detailed information:

docker volume inspect MyLinuxData

Using the “docker volume inspect <VolumeName>” command will show the volume mount location:

“Mountpoint”: “/var/lib/docker/volumes/MyLinuxData/_data”,

In this case, the mount location is on the Linux box under the Docker volumes folder. This means all data can persist on your local machine.
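For example, here’s a sketch of attaching the named volume with the ‘-v’ switch of “docker run” (the container name and SA password below are just placeholders; /var/opt/mssql is where SQL Server on Linux keeps its databases):

## - Run a SQL Server container with the named volume mounted on the data folder:
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=$SqlPwd01A' -p 1433:1433 -v MyLinuxData:/var/opt/mssql --name sql2k19_voltest -d mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu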

Local Machine Folder

This option seems straightforward, as there’s no need to create a Docker volume. Just use the ‘-v’ switch in the “docker run” command line.

In the following command line, I’m starting a Docker container with a previously configured Microsoft SQL Server instance. I include the ‘-v’ switch to mount a folder from my local machine.

docker run -p 1455:1455 -v /home/maxt/TempSQLBackups:/home/TempSQLBackups --name sql2k19ctp23_v02 -d sql2k19_ctp2.3_sandbox:CTP2.3-Version02

Notice in this case, to verify that my SQL Server container has mounted my local machine folder, I can execute the following command:

docker exec -i sql2k19ctp23_v02 ls /home/TempSQLBackups

Using “docker exec -i <containerid/name> ls <containerfolderlocation>” will display all the files back on the screen. Now, anything you add to that local folder will be accessible to the container.
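For example, assuming you have a backup file on the local machine (the file name below is just an illustration), you can copy it into the shared folder from a PowerShell Core session and immediately list it from inside the container:

## - Copy a local backup into the shared folder (example file name):
Copy-Item ./AdventureWorks2017.bak /home/maxt/TempSQLBackups

## - Verify the container can see the new file:
docker exec -i sql2k19ctp23_v02 ls /home/TempSQLBackups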

Summary

This is a good starting point when learning how to work with Docker data in containers. You’ll still go through some trial-and-error while learning how to build container images and make data persist for your application. But it’s much faster and easier to rebuild images. This is one of those must-learn technologies.

References

Check out the following blog post, as it helped me understand “Persistent Storage”:

PowerShell Core – Updating your SQL Server Linux Docker Containers Images

In this post I’ll be covering how to install some needed components, how to commit the changes, and how to create a revised image for deployment.

In recent events and meetings, I’ve been talking about how to work with SQL Server Linux Docker container images. While these images get your container up and running quickly, they lack some tools that may be useful to complete the SQL Server configuration.

What’s missing?

The SQL Server images contain a small footprint of the Ubuntu 16.04 Linux Operating System (OS) and are meant for quick deployment. The OS side of the container needs to be kept updated regularly.

At the same time, when you start exploring inside the container, there are still missing components you may want to use:

  • vim – for editing text files.
  • ifconfig – to check your network interfaces.
  • ping – to check whether an IP address is reachable across the network.
  • curl – for transferring data.

So, after you pull the docker image, create the container using “docker run …“, and then get to the container’s bash session by using “docker exec -it …“. Remember, the bash session only gets you to the “root” level, as there are no users set up on these containers.

## - First time setup: (for "server:2019-CTP2.2-ubuntu" and )
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=$SqlPwd01A' -e "MSSQL_PID=Developer" -p 1433:1433 --name sql2k19_CTP2.3 -d mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu;

## - Display all active containers;
docker ps -a

At this point, make sure the active container is in “Up” status. Now we can proceed to update the container.

Installing Missing Components

To get access to the container we use the “docker exec …” command. This command will get us to the container’s “root” prompt.

## - Configuring your container:
docker exec -it sql2k19_CTP2.3 bash

The first thing I would suggest is to execute the following two commands:

## - Updating OS:
apt update

apt upgrade

Notice that if you try to execute them, vim, ping, ifconfig, and curl are not installed in the container image.

Let’s proceed to install these components by executing the following command:

## - Installing additional components:
apt-get -y install \
curl \
vim \
iputils-ping \
net-tools \
powershell-preview

Also, it’s a good idea to create a Downloads folder in case you need to install other applications.

## - Create Downloads folder in root:
mkdir Downloads
chmod 755 Downloads

Notice that PowerShell Core Preview was included with the other missing components.  PowerShell has become a great tool to have in a Linux environment.

PowerShell Core SQLServer Module

Although this is optional, nothing prevents you from including PowerShell Core Preview 6.2.0-RC1 with the SqlServer module, which includes the “Invoke-Sqlcmd” cmdlet used by many administrators. This is a great module to have in a SQL Server container image.

So, from the “root” prompt in the container, open PowerShell Core Preview, then proceed to install the SqlServer module preview version 21.1.18095.

## - Open PowerShell Core:
pwsh-preview

## - Install SqlServer module preview:
Install-Module SQLServer -AllowPreRelease

This completes the essentials for using PowerShell to help manage SQL Server instances.
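As a quick sanity check, here’s a minimal sketch of running a query with the module from the container’s pwsh-preview prompt (the ‘sa’ password is whatever you set in the SA_PASSWORD environment variable when the container was created):

## - Quick Invoke-Sqlcmd test from inside the container:
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance 'localhost' -Username 'sa' -Password '$SqlPwd01A' -Query 'SELECT @@VERSION;'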

How About Anaconda?

We could install the latest version of Anaconda with Python 3.7 in our SQL Server container image.

## - Change directory to Downloads folder:
cd Downloads

## - Download Anaconda with Python 3.7:
wget https://repo.anaconda.com/archive/Anaconda3-2018.12-Linux-x86_64.sh

## - Install Anaconda with Python 3.7:
bash Anaconda3-2018.12-Linux-x86_64.sh

This will give us the ability to test Python scripts within the container.

Testing installed Components

We need to verify that all previously installed components are working. Go back to the container “root” prompt and execute the following commands:

ifconfig
ping 127.0.0.1
vim ~/.bashrc
pwsh
sqlcmd

Now, executing the “sqlcmd” command will not work unless you add the path to the executable to the “root” user’s ~/.bashrc file:

## - Need to include the path to SQLCMD command:
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc

## - Refresh ~/.bashrc:
source ~/.bashrc

## - Run Sqlcmd command:
sqlcmd -S localhost -U sa -P 'sapwd'
> select @@version
> go
> exit

This is a good indication that our SQL Server container is active. And now we have all the missing components installed.

Now, we need to make sure we don’t lose our changes.

Creating your own SQL Server Docker image

This is an important step so you won’t lose the changes already made to the container. Below are the brief steps to follow:

## - Commit the container changes: (repository name must be lowercase but Tags are OK with uppercase)
## -> docker commit "<Get-Container_ID>" "<Image-name>":"<TAG name>"

docker commit "<Get-Container_ID>" sql2k19_ctp2.3_sandbox:CTP2.3-Version01

## - List images included the committed ones:
docker images

## - Stop Image before the Save step:
docker stop sql2k19_CTP2.3
docker ps -a

## - Save docker updated image:
docker save -o ./Downloads/sql2k19ctp23_sandboxVer01.tar sql2k19_ctp2.3_sandbox

With the “docker commit …” command, you provide both the image name (all lowercase) and a TAG name (uppercase allowed). You can be creative in choosing a naming convention for your image repositories.

It’s very important to save images after doing the commit. I found out that an active container would be useless without its image. As far as I know, I haven’t found a way to rebuild an image from an existing container if the image was previously removed.
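If you ever need to bring a saved image back, for example on another machine or after cleaning up your local images, here’s a minimal sketch using the tar file created above:

## - Reload the previously saved image from the tar file:
docker load -i ./Downloads/sql2k19ctp23_sandboxVer01.tar

## - Confirm the image is back in the local repository:
docker images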

Summary

I hope this brief rundown on working with SQL Server Docker container images will get you started with modifying existing images for quick deployment.

One thing to keep in mind!

  • The SQL Server container memory needs to be 4 GB minimum.
  • In Windows, if you’re using non-Hyper-V virtualization tools such as Virtualbox, the virtual machine memory needs to be changed to 4 GB.
  • Also, when you are creating images, the virtual machine disk size defaults to 20 GB. This may need to be increased unless you keep cleaning/removing images to make room.

Just lay out what you need, then commit, save, and deploy your Docker solution in your environment.

Keep learning about this amazing technology!

 

PowerShell Core – How to install SQLServer Module (Preview) in WSL – Ubuntu 18.04

If you have been following Aaron Nelson’s blog post on the cross-platform availability of Invoke-Sqlcmd in the SqlServer module, then you have probably already proceeded to download the PowerShell SqlServer module.

At the same time, on March 5th,  PowerShell Core 6.2.0-rc1 (Release Candidate) was made available for download.
Go and get it!

The thing is, in order to use the Invoke-SqlCmd cmdlet, you need to use PowerShell Preview version 6.2.0-rc1 (or greater).

Now, the SqlServer module can be easily installed on all platforms, but I found out that it won’t install in Windows 10 WSL Ubuntu 18.04.

So, what’s the issue with Windows 10 WSL – Ubuntu 18.04?

Normally, when working with modules in PowerShell Core, I always use the following cmdlets: Uninstall-Module to remove the module, and then Install-Module with the “-AllowPrerelease” parameter. This works flawlessly, but I found out that it wouldn’t install the preview in WSL – Ubuntu 18.04.

I don’t know why, but it was installing the non-preview SqlServer module version 21.1.18080. So, the following PowerShell Core command line will force the installation of SqlServer module version 21.1.18095-preview.

Install-Module sqlserver -RequiredVersion 21.1.18095-preview -AllowPrerelease -Force
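To confirm which version actually got installed, here’s a quick check from the same PowerShell Core session:

## - Verify the installed SqlServer module version (including the preview tag):
Get-InstalledModule SqlServer -AllowPrerelease | Select-Object Name, Version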

Now, we can start writing PowerShell Core SQL Server scripts in our Windows Subsystem for Linux – Ubuntu 18.04.

PowerShell – Docker Setup for Windows 10 WSL Ubuntu 18.04 with VMware Workstation

The purpose of this blog post is to show how to set up Docker Community Edition on Windows 10 with VMware Workstation to be used with the Windows Subsystem for Linux (WSL).

There are a few blog posts that helped me figure out what’s needed to get this to work, and I’ll be sharing these links at the end of this post.

My current environment

My current environment consists of the following components:

  • Windows 10 Build 17763
  • VMware Workstation Pro 12
  • *Oracle Virtualbox 5.2
  • WSL – Ubuntu 18.04
  • SQL Server 2017 Developer Edition
  • Windows PowerShell (v5.1.17763.316)
  • PowerShell Core GA v6.3.1 (both Windows and Linux)
  • PowerShell Core Preview v6.2.0-preview.4 (both Windows and Linux)

*Note: This is not the latest version of Virtualbox, but it’s still supported.

Remember, the purpose of this environment is to build a “developer sandbox” that can allow me to learn and work with Docker containers.

What’s needed!

Because I’m using VMware Workstation instead of Hyper-V, there are a few things that need to be in place to make this work. Windows 10 needs to have the following:

  • All Hyper-V services need to be disabled by using the “System Configuration” tool.

  •  Install VMWare Workstation Pro. (https://www.vmware.com/products/workstation-pro.html)
  •  Install Oracle Virtualbox version 5.2. (https://www.virtualbox.org/wiki/Download_Old_Builds_5_2)

  •  Install from the Microsoft Store, WSL – Ubuntu 18.04.

  • And, make sure to run “sudo apt update” and “sudo apt upgrade”, because images are not updated with the latest components.

Installing PowerShell Components

Next, the following Docker component packages from Chocolatey need to be installed using Windows PowerShell with administrator privileges:

* Install docker

choco install -y docker

* Install docker-machine-vmwareworkstation

choco install -y docker-machine-vmwareworkstation

Getting WSL Ready for Docker

Now, open the “WSL – Ubuntu 18.04” Linux console and execute the following *commands:

sudo apt update

sudo apt upgrade

*Note: You’ll need to run these two commands manually to keep your Linux distribution up-to-date.

At this point, follow the Docker installation instructions for “Docker-CE for Ubuntu 18.04“. But, in a nutshell, here’s the shortcut:

sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"

sudo apt-get update

sudo apt install docker-ce

sudo usermod -aG docker maxt

exit

At this point, make sure to reopen the WSL Linux console.

Setup Docker-Machine in Windows

Back in Windows PowerShell, the next steps show how to get Docker working in “WSL – Ubuntu 18.04“. Starting in the Windows PowerShell console, execute the following commands:

docker-machine --native-ssh create -d vmwareworkstation default
docker-machine create docker-host

These commands should complete without any errors. At the same time, two virtual machines, “default” and “docker-host”, will be created and running in *Virtualbox.

*Note: These two *NEED* to be running in order for Docker to work with WSL. At the same time, both VMware Workstation and Virtualbox need to be installed, or this will not work.

To check that the Docker-Machine environments are working, use the following command:

docker-machine ls

Next, execute the following command to display the “docker-host” environment settings, which will be copied into the Linux user’s ~/.bashrc file.

docker-machine env docker-host
PS C:\WINDOWS\system32> docker-machine.exe env default
$Env:DOCKER_TLS_VERIFY = "1"
$Env:DOCKER_HOST = "tcp://192.168.220.xxx:2376"
$Env:DOCKER_CERT_PATH = "C:\Users\max_t\.docker\machine\machines\default"
$Env:DOCKER_MACHINE_NAME = "default"
$Env:COMPOSE_CONVERT_WINDOWS_PATHS = "true"
# Run this command to configure your shell:
# & "C:\ProgramData\chocolatey\lib\docker-machine\bin\docker-machine.exe" env default | Invoke-Expression

Open a “WSL – Ubuntu 18.04” console to edit the user’s “~/.bashrc” file and add the following Docker variables:

sudo vim ~/.bashrc

## Added manually for Docker machine docker-host:
export DOCKER_HOST=192.168.99.xxx:2376
export DOCKER_TLS_VERIFY=1
export DOCKER_CERT_PATH=/mnt/c/users/max_t/.docker/machine/machines/docker-host
export DOCKER_MACHINE_NAME=docker-host
export COMPOSE_CONVERT_WINDOWS_PATHS=true

Reopen the “WSL – Ubuntu 18.04” console.
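Before running any containers, a quick way to confirm the WSL Docker client can reach the “docker-host” VM is to check that both the client and the server respond (a minimal check, assuming the variables above were loaded into your session):

# - The Client section comes from WSL; the Server section should come from the docker-host VM:
docker version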

Testing Docker in WSL

Now, I can test Docker in my “WSL – Ubuntu 18.04” console session. Open a PowerShell Core console and execute the following command to run the Docker hello-world demo:

docker run hello-world

This command downloads (pulls) the Docker image, then runs the hello-world container. If everything works as expected, it will display the hello-world greeting text.

To check both Docker images and containers in WSL, use the following commands:

# - Check for all pulled images in system:
docker images

# - Check the status of active containers:
docker ps -a

As you can see, there are no issues executing Docker command lines from PowerShell Core on Linux.

To see the full list of docker command-line help available, click on the following link.

After all this is done, Docker is working in my WSL environment.

Limitations

YES! There are limitations. This is a workaround for using Docker without Hyper-V. It will allow you to:

  • Pull images
  • Update containers
  • Save images

In my environment, I found limitations working with Docker networking from WSL, which can impact the Windows Docker-Machine VM “docker-host” interface. This issue can force you to rebuild both VM interfaces: “default” and “docker-host“.

Make sure to learn how to commit, save, and reload Docker images.  Don’t lose your changes!

So, if you have either VMware Workstation and/or Oracle Virtualbox, consider investing the time to create a Linux virtual machine and then install Docker CE.

Summary

We have accomplished setting up Docker containers in *Windows 10 “WSL – Ubuntu 18.04” using both Windows PowerShell and PowerShell Core in Linux. Keep in mind that using Oracle Virtualbox v5.2 alongside VMware Workstation is required to make this work.

*Note: This post is meant for people who want to make Docker work in WSL Linux.

Also, if you’re familiar with PowerShell, Docker commands execute without any issues. Now, I can use my favorite editor, SAPIEN’s PowerShell Studio, to build my automation scripts with docker commands.

What’s Next?

Try downloading other Docker images, like SQL Server 2017 and SQL Server 2019. This is the quickest way to provide a built solution using containers.

Learn about Docker Compose and Kubernetes, as these can be used in cloud environments as well.

Go and explore the possibilities of provisioning solutions for your organization!

Resource links

PowerShell Core Ubuntu 18.04 – PSRemoting to an Active Directory Machine

Sometimes there’s a need to do PowerShell remoting from Linux to a Windows system. In my lab environment, I was able to install, configure, and establish a PowerShell remote connection from a Linux Ubuntu 18.04 system to an *Active Directory-joined Windows system.

*Note: Before trying the following steps, if you’re in a corporate domain, consult with your security team. I would recommend that you try this scenario in a virtual machine environment.

I had been struggling with OpenSSH on both Windows 10 (Build 1803) and Windows Server 2019 with no success connecting from Linux. So, I decided to try installing the Kerberos client components on my Ubuntu system, and it works! And with no need to join my Linux system to my virtual Active Directory domain.

Install and configuring Kerberos Client

  • I need to install and configure the Kerberos Client application on my system:
$ sudo apt-get install krb5-user
  • Customizing *krb5.conf file settings for my domain:
$ sudo vim /etc/krb5.conf
  • The following are my custom settings in the krb5.conf file for “DOMAINNAME” Kerberos:
[libdefaults]
default_realm = DOMAINNAME.COM

# The following are custom settings for "DOMAINNAME" Kerberos:
dns_lookup_realm = true
dns_lookup_kdc = true
default_tgs_enctypes = arcfour-hmac-md5 des-cbc-crc des-cbc-md5
default_tkt_enctypes = arcfour-hmac-md5 des-cbc-crc des-cbc-md5
permitted_enctypes = arcfour-hmac-md5 des-cbc-crc des-cbc-md5

[realms]
DOMAINNAME.COM = {
kdc = DOMAINMACHINENAME
admin_server = DOMAINMACHINENAME
}

[domain_realm]
.com = DOMAINNAME

*Note: Make a copy of the krb5.conf file before making any changes.

One thing to point out! Both DOMAINNAME and DOMAINMACHINENAME must be in uppercase.

Configuring ssh

The next step involves configuring ssh for Kerberos negotiation. This is the ssh_config file (not sshd_config).

$ sudo vim /etc/ssh/ssh_config

Make sure the following parameters are set at the end of the *ssh_config file:

SendEnv LANG LC_*
HashKnownHosts yes
GSSAPIAuthentication yes
GSSAPIDelegateCredentials no
GSSAPIKeyExchange yes

*Note: If any of these are missing, don’t touch the commented-out lines; just copy/paste the settings and set the values.

After completing the changes, I would recommend a reboot.

Testing and working Kerberos Client

Here are a few Linux commands to work with the Kerberos client. If the krb5.conf settings are correct, then the following commands should work without any issues.

1. This command will verify the user against the domain, asking for the password.

$ kinit username@domainname

2. Shows the list of cached Kerberos tickets and credentials.

$ klist

3. To delete/clear all Kerberos cache entries:

$ kdestroy

What about the settings on the Windows systems?

I won’t cover the whole PowerShell remoting setup here. But I will highlight what’s needed to make Linux connect to an Active Directory domain system.

  • Enable PSRemoting

In a PowerShell console, run the “Enable-PSRemoting -Force” command on both client and server. This command will add the firewall rule that allows PowerShell remoting to work.

  • Check WinRM Service

Check that the Windows Remote Management (WinRM) service is running. By default, on a Windows 10 client, this service is set to “Manual”.
On the server, just verify that the service is running.

Before connecting Linux to a Windows domain system, make sure to test PowerShell remoting between Windows machines, as sketched below. This will guarantee that you have everything working correctly.
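Here’s a minimal sketch of that Windows-to-Windows check (the computer name is just a placeholder):

# - From a Windows PowerShell console, confirm WinRM is reachable on the target:
Test-WSMan -ComputerName WINSERVER01

# - Then try an interactive remote session:
Enter-PSSession -ComputerName WINSERVER01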

Name Resolution Tip

I don’t join my Linux system to my AD domain. So, to resolve my name resolution issues, I manually update the hosts file on my systems. This includes the domain IP address as well as all the other systems.

hosts file
:
xxx.xxx.xxx.xxx domainname.com
:

Testing connectivity

Final test: connecting from Ubuntu 18.04 to a domain system.

1. In Linux, open PowerShell:

$ pwsh

2. Prepare the domain user:

PS /home/user> kinit domainuser

3. Create a *PowerShell Remote interactive session:

PS /home/user> Enter-PSSession -ComputerName wincomputer -Authentication Negotiate -Credential user@domainname.com

*Note: This remote connection will open Windows PowerShell and not PowerShell Core.

Summary

So, by installing and configuring only the Kerberos user client on Ubuntu 18.04, you can connect your Linux system to Active Directory domain systems. But remember, this will connect to a Windows PowerShell session only.

I’m hoping that in the near future we’ll have the ability to select a PowerShell version. Wait!! There’s a way to open a PowerShell Core session instead of Windows PowerShell!!

How To Connect to PowerShell Core

So, by default you’re going to connect to Windows PowerShell. But if you use the ‘-ConfigurationName’ parameter followed by either ‘PowerShell.6‘ or ‘PowerShell.6-Preview‘, then you’ll get a PowerShell Core session. You can also use a specific version, such as ‘PowerShell.6.1.0‘.

Enter-PSSession -ComputerName venus -Authentication Negotiate -Credential max_t@trinity.com -ConfigurationName PowerShell.6

Thanks to Steve Lee (Microsoft PowerShell Team) for letting me know this is already available.

References

The following links helped me figure out the components needed to make my lab environment work.

Getting the latest Tools for PowerShell SQL Server Automation

You all know how important it is to have tools that make our system administration life easier and help us become heroes in our organization. Here’s a starter guide to get you going with some PowerShell and SQL Server tools.

What is available for automation!

For script automation, we could install either or both versions of PowerShell Core (as of February 19th, 2019):

Here are some important PowerShell Modules to use for SQL Server management scripting:

  • *SQLServer – This module can currently be used with SQL Server 2017 and greater.
  • *DBATools – This is a community-supported module that will work with SQL Server 2000 and greater.
  • DBAReports – Support for Windows SQL Server.
  • DBCheck – Support for Windows SQL Server.

*Note: These modules are becoming popular on cross-platform (non-Windows) systems.

All of the above modules can be downloaded from the PowerShell Gallery from the PowerShell console using the Install-Module cmdlet.

Install-Module -Name SQLServer -Force -AllowClobber;

Now, when working with older versions of SQL Server (2008 through 2017), you will find the SQLPS module is loaded during the SQL Server installation.

Just remember, since SQL Server 2017, Microsoft has changed the PowerShell SQLPS module to the SqlServer module, downloadable from the PowerShell Gallery. The older SQLPS module is not available in the PowerShell Gallery; it is only installed during the SQL Server installation.

What if the PowerShell SqlServer module can’t provide the script you need?

It won’t hurt to install the SQL Server Management Objects (SMO) library in case you want to be creative and start building your own SQL PowerShell scripts. This library is already available cross-platform, meaning that it will work in Windows, Linux, and macOS environments.

In this case, you can install the SQL Server SMO library “Microsoft.SqlServer.SqlManagementObjects” from the PowerShell Console using the Install-Package cmdlet.

Install-Package -Name Microsoft.SqlServer.SqlManagementObjects -AllowPrereleaseVersions;
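As a small example of what SMO gives you, here’s a minimal sketch that lists the databases on a local instance (it assumes the SMO assemblies are already loaded, for example by importing the SqlServer module):

## - Load SMO via the SqlServer module and connect to a local instance:
Import-Module SqlServer
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'

## - List the databases with their recovery model:
$srv.Databases | Select-Object Name, RecoveryModel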

Wait! There is more

As you already know, to manage SQL Server in a Windows environment, we use SQL Server Management Studio. But this application won’t work cross-platform.

So, the cross-platform option available is Azure Data Studio (February edition):

Don’t forget to include the following extensions:

What about Python?

By now you should already know that Python has been around for many years as a cross-platform, interpreted, object-oriented, high-level language. And its popularity keeps increasing.

I would recommend taking a look at the Anaconda Distribution, specifically the one with the latest version of Python (v3.7).

Download the Anaconda data science platform:

This installation will include *All* Python packages available to build an application.

And, Python can interact with PowerShell too!

Ah finally Containers!

Yes! Containers have become popular and can’t be ignored. They can be used on Windows, Linux, and in any cloud environment. Go ahead and learn how to work with and manage Docker containers.

Go to the Docker site to download Docker CE.

Don’t forget to check Docker Hub to find the latest Docker container images available for download. You will need to create an account before downloading images. From there, you can search for the SQL Server image.

In Summary

As technology keeps improving, make sure to stay up to date. This gives us the opportunity to improve our job position and be of value to the organization that hires us.

Don’t forget to look for the nearest technology event in your area, as this is an opportunity to learn for free and gain invaluable knowledge.

PowerShell Core Ubuntu 18.04 Using Docker Containers

Containers have been popular in the industry for some time now. They’re becoming as important to learn as PowerShell. Have you tried them?

Installing Docker

While learning about Docker containers, I noticed that Docker is much easier to install on a Linux system. On Windows, Hyper-V is a requirement to install Docker, and especially if you want to use the “Windows Subsystem for Linux” (WSL) feature, there’s more setup to complete. Since I’m not using Hyper-V, I’m using VMware Workstation. To keep it simple, I created an Ubuntu 18.04 VM using VMware Workstation.

You can find the Docker CE installation instructions in the following link.

If you’re using Ubuntu 18.04, make sure to install curl, as it isn’t included in the OS.

$ sudo apt install curl

After the Docker CE installation completes, make sure to run the “hello-world” image to test that containers run.
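For example (sudo is needed until your user is added to the docker group, which is covered below):

$ sudo docker run hello-world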

If you have Hyper-V and you’re interested in making Docker CE work in Windows 10 WSL (Windows Subsystem for Linux), check the following link on “running-docker-on-bash-on-windows“.

Use PowerShell Core

Make sure you have the latest PowerShell Core installed. If it was previously installed, then use “$ sudo apt upgrade” to get the latest version. Of course, this all depends on which installation method you previously chose to install PowerShell.

Or, just go to the Github PowerShell page to get the latest version(s) for either GA (Generally Available) or the Preview.

Now, before starting to work with Docker as a non-root user, you will need to add the current user to the “docker” group by using the following command line:

$ sudo usermod -aG docker your-userid

Then, open a PowerShell Core session to start working with Docker containers.

The following docker commands are the essentials to work with containers and can be executed from within the PowerShell Core session:

1. Listing active containers

PS /> docker ps -a

2. Listing images in your system

PS /> docker images

3. Get a container image

PS /> docker pull image-name (not GUID)

4. Image(s) cleanup

PS /> docker rmi image-GUID

If you want to find a particular image, go to the Docker Hub site. An account is required in order to log in to Docker Hub. There you’ll find certified images that you can download using the “docker pull <imagename>” command line.

Docker PowerShell Containers

Interestingly enough, while doing a search in Docker Hub, I found a couple of PowerShell Core container images that caught my attention.

1. “PowerShell” (Verified Publisher).

2. “azuresdk/azure-powershell-core”.

Also, “PowerShell-Preview” is available under the PowerShell Core container page.

To try them out, just execute the following command:

1. PowerShell – Read the documentation, then execute the following command line to pull the image to your system.

PS /> docker pull mcr.microsoft.com/powershell

2. Azure-PowerShell-Core – As of now, the current documentation is outdated.

PS /> docker pull azuresdk/azure-powershell-core

3. PowerShell-Preview

PS /> docker pull mcr.microsoft.com/powershell:preview

After pulling the docker images, we are ready to try the containers by executing the following commands:

1. Execute PowerShell Core GA

docker run -it mcr.microsoft.com/powershell

2. Execute *Azure-PowerShell

docker run -it azuresdk/azure-powershell-core

3. Execute PowerShell Core Preview (latest)

docker run -it mcr.microsoft.com/powershell:preview

Note: It seems the Azure-PowerShell container is not currently up to date for either the PowerShell Core GA or the Azure module versions. Just check for upcoming updates on these repositories later.

Summary

As you can see, working with Docker containers and PowerShell makes for a convenient test environment. Either way, whether you want to use a Linux terminal session with or without PowerShell, there’s no reason not to try it. So, after you have created the Linux VM and completed the installation of Docker CE, you’ll have it working in minutes.

 

Useful PowerShell Azure Connect CLI Options with Az Module Version 1.0

Recently, the Microsoft Azure team released the new version of the AzureRM module, which can be installed in both Windows PowerShell and PowerShell Core. Now, with the new version renamed from AzureRM to ‘Az‘, Microsoft is encouraging everyone to download and start using this refreshed module moving forward. Just make sure to keep reporting any issues you find with this module.

Don’t forget to check out the recent blog about the Azure PowerShell ‘Az’ Module version 1.0.

One of the most important fixes was the issue experienced with the TokenCache initialization when using the Import-AzContext cmdlet. This issue was causing many DevOps folks to go back to an older version of the AzureRM module to get their scripts working again.

The “TokenCache initialization” issue had been reported on GitHub for some time, and it has finally been fixed.

Now, let’s take a look at how to connect to Azure.

Azure Connection CLI options

The following cmdlets can assist you with Azure connectivity:

  • Connect-AzAccount
  • Save-AzContext
  • Import-AzContext
  • Enable-AzContextAutoSave
  • Disable-AzContextAutoSave

All of these cmdlets belong to the “Az.Accounts” module, which is included in the Az module. This is where you can find different ways to connect to Azure.

Below is the full list of all Az modules installed from the PowerShell Gallery:

PS [118] > Get-Module -ListAvailable Az.* | Select Name, Version

Name Version
---- -------
Az.Accounts 1.0.0
Az.Aks 1.0.0
Az.AnalysisServices 1.0.0
Az.ApiManagement 1.0.0
Az.ApplicationInsights 1.0.0
Az.Automation 1.0.0
Az.Batch 1.0.0
Az.Billing 1.0.0
Az.Cdn 1.0.0
Az.CognitiveServices 1.0.0
Az.Compute 1.0.0
Az.ContainerInstance 1.0.0
Az.ContainerRegistry 1.0.0
Az.DataFactory 1.0.0
Az.DataLakeAnalytics 1.0.0
Az.DataLakeStore 1.0.0
Az.DevTestLabs 1.0.0
Az.Dns 1.0.0
Az.EventGrid 1.0.0
Az.EventHub 1.0.0
Az.HDInsight 1.0.0
Az.IotHub 1.0.0
Az.KeyVault 1.0.0
Az.LogicApp 1.0.0
Az.MachineLearning 1.0.0
Az.MarketplaceOrdering 1.0.0
Az.Media 1.0.0
Az.Monitor 1.0.0
Az.Network 1.0.0
Az.NotificationHubs 1.0.0
Az.OperationalInsights 1.0.0
Az.PolicyInsights 1.0.0
Az.PowerBIEmbedded 1.0.0
Az.RecoveryServices 1.0.0
Az.RedisCache 1.0.0
Az.Relay 1.0.0
Az.Resources 1.0.0
Az.ServiceBus 1.0.0
Az.ServiceFabric 1.0.0
Az.SignalR 1.0.0
Az.Sql 1.0.0
Az.Storage 1.0.0
Az.StreamAnalytics 1.0.0
Az.TrafficManager 1.0.0
Az.Websites 1.0.0

PS [169] >

There’s a total of 45 ‘Az‘ modules installed on your system.

Simple Connection

The first time you connect to your Azure subscription, you’ll use the “Connect-AzAccount” cmdlet. This cmdlet will ask you to open a browser, then copy/paste a URL and a code that will authorize your device to connect to Azure.
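A minimal example of that first connection:

## - Sign in to Azure; follow the device-login URL and code shown in the console:
Connect-AzAccount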

So, after the device has been authorized, you can start working with Azure commands from any of the PowerShell consoles.

Now, using this cmdlet will create the “.Azure” folder under the user profile directory containing the following files:

c:\Users\username\.Azure
AzInstallationChecks.json
AzurePSDataCollectionProfile.json
AzureRmContext.json
AzureRmContextSettings.json
TokenCache.dat

But when using the Connect-AzAccount cmdlet, the connection only lasts while your PowerShell session is active. So, as soon as you close the console, you’ll have to repeat the whole process to connect to Azure again.

Reusing Azure Authentication

We can reuse the Azure authentication in order to avoid re-opening the browser, using the following cmdlets: Save-AzContext and Import-AzContext. These two cmdlets are very useful: Import-AzContext loads the Azure authentication information from a JSON file previously saved with the Save-AzContext cmdlet.

So, as soon as you initially establish the connection with Connect-AzAccount and set up all the necessary configuration (resource groups, storage accounts, availability sets…), proceed to use Save-AzContext to save the Azure authentication information to a folder location of your choosing.

Then, after exiting your PowerShell session, just use the Import-AzContext cmdlet to reconnect. I’m sure this is one of the ways many Azure DevOps folks use Import-AzContext to automate scripts in PowerShell.
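Here’s a minimal sketch of that save/import flow (the file path is just an example):

## - After connecting with Connect-AzAccount, save the context to a JSON file:
Save-AzContext -Path './MyAzureContext.json'

## - In a later session, reconnect without the browser step:
Import-AzContext -Path './MyAzureContext.json'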

Just make sure to occasionally refresh the file, as you’ll be making changes to the Azure account(s) by adding/removing resources.

Staying Connected to Azure Account

Now, this functionality has been available for a while through the cmdlets Enable-AzContextAutoSave and Disable-AzContextAutoSave. If you want to stay connected to your Azure account, then every time you close and reopen a PowerShell console you can start working with Azure cmdlets without any delays.

And, since you probably have been working with the Azure modules for some time, just execute the Enable-AzContextAutoSave cmdlet.
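For example:

## - Keep the Azure context saved across PowerShell sessions for the current user:
Enable-AzContextAutoSave -Scope CurrentUser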

That’s it!

Now, you can close and re-open a PowerShell console and immediately start working with Azure. This will make you feel like you are using Cloud Shell on your desktop.

Just remember, in order to disable this functionality, run the Disable-AzContextAutoSave cmdlet and exit the PowerShell console.

Summary

As you can see, there are many ways to get connected to your Azure account(s), cross-platform, from any system, to suit your needs. This will work from anywhere, on any system!

Just go ahead, experiment, and find your own way to work proactively with Azure.