Azure from the Linux command line (part 1)

Hi!

Since I grew up IT-wise with a Unix command shell, I tend to do a lot of things with it, including managing my Azure deployments, since there is the great Azure command-line interface, or cross-platform (“xplat”) CLI.

(If you’re interested in the details, this is all open source, released under an Apache license, and on GitHub: https://github.com/WindowsAzure/azure-sdk-tools-xplat.)

This blog post documents a few tricks I’ve been using to get up and running fast.

First: You need to connect the xplat CLI to your Azure subscription. To do that, simply run

$ azure account download

after installing the CLI. If you’re on a remote machine via ssh, this will simply give you a URL to open in your browser. Make sure you’re already logged into the Azure portal; otherwise you will need to log in first when going to this URL.

The website will now give you a .publishsettings file for download. The same file is used when setting up a connection between Visual Studio and an Azure subscription.

Now get this file to your Linux box (and make sure you keep it safe in transit; this file contains a management certificate key that can manage your subscription!) and import it into the xplat CLI:

$ azure account import <publishsettingsfile>

And now you’re all set.

Now let’s look around

$ azure help

info:    Executing command help
info:             _    _____   _ ___ ___
info:            /_\  |_  / | | | _ \ __|
info:      _ ___/ _ \__/ /| |_| |   / _|___ _ _
info:    (___  /_/ \_\/___|\___/|_|_\___| _____)
info:       (_______ _ _)         _ ______ _)_ _
info:              (______________ _ )   (___ _ _)
info:
info:    Windows Azure: Microsoft’s Cloud Platform
info:
info:    Tool version 0.7.4
help:
help:    Display help for a given command
help:      help [options] [command]
help:
help:    Open the portal in a browser
help:      portal [options]
help:
help:    Commands:
help:      account        Commands to manage your account information and publish settings
help:      config         Commands to manage your local settings
help:      hdinsight      Commands to manage your HDInsight accounts
help:      mobile         Commands to manage your Mobile Services
help:      network        Commands to manage your Networks
help:      sb             Commands to manage your Service Bus configuration
help:      service        Commands to manage your Cloud Services
help:      site           Commands to manage your Web Sites
help:      sql            Commands to manage your SQL Server accounts
help:      storage        Commands to manage your Storage objects
help:      vm             Commands to manage your Virtual Machines
help:
help:    Options:
help:      -h, --help     output usage information
help:      -v, --version  output the application version

That does not look too bad after all. Just remember azure help <command>; this is your first stop whenever you get stuck.

So let’s set up a Linux VM. First, let’s check which pre-configured Linux images are available.

$ azure vm image list

Now you should see a lot of images. When I just ran this, I got more than 200 lines of output. Image names look like this:

 b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-13_10-amd64-server-20140226-en-us-30GB

Now we could copy this name to our clipboard and paste it into the next command, but let’s have the shell do that for us. Here’s the idea:

#!/bin/bash
IMAGENAME=`azure vm image list | grep -i Ubuntu-13_10-amd64-server | tail -1 | awk '{print $2}'`

Get the list of VM images, select only the ones we’re interested in, then take the last (i.e. the most recent) of that list, and keep just the second string, which is the image name. Easy, right? Note the single backquotes at the beginning and the end of that line; this is shell syntax for “take the output of that command and store it in that shell environment variable”.
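To try the pipeline in isolation, here is a self-contained sketch that runs the same grep/tail/awk chain over a few mocked-up lines of `azure vm image list`-style output (the short image names below are made up for illustration):

```shell
#!/bin/bash
# Mock a few lines of `azure vm image list` output (hypothetical names),
# so the selection pipeline can be tried without an Azure subscription.
list_images() {
  printf 'data:    %s\n' \
    'b39f27a8__Ubuntu-13_10-amd64-server-20140115-en-us-30GB' \
    'b39f27a8__Ubuntu-13_10-amd64-server-20140226-en-us-30GB' \
    'a6994943__Win2012-Datacenter'
}

# Same chain as above: filter the matching images, take the newest
# (last) line, and keep field 2, the image name.
IMAGENAME=`list_images | grep -i Ubuntu-13_10-amd64-server | tail -1 | awk '{print $2}'`
echo "$IMAGENAME"
```

Because the image list is sorted with the newest build last, the `tail -1` reliably picks the most recent matching image.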

To use the VM, we need to login, so let’s use a password for now:

PASSWORD="AtotallySECRET!PA55W0RD"
echo Password is $PASSWORD

Next, let’s create the VM:

azure vm create -e -z extrasmall -l "West Europe" $1 $IMAGENAME azureuser "$PASSWORD"

Here’s the output of running this shell script:

$ bash create_ubuntu_vm.sh contosolinux

Password is AtotallySECRET!PA55W0RD
info:    Executing command vm create
+ Looking up image b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-13_10-amd64-server-20140226-en-us-30GB
+ Looking up cloud service
+ Creating cloud service
+ Retrieving storage accounts
+ Creating VM
info:    vm create command OK

And after about two minutes I can ssh into contosolinux.cloudapp.net as “azureuser” with that super secret password.

Hope it helps,

H.

ps: to get rid of the VM again, I just type azure vm delete -b contosolinux

pps: in case that’s too harsh, azure vm shutdown contosolinux, azure vm start contosolinux and azure vm restart contosolinux work as well. And azure vm list shows you what Azure thinks your VMs are doing right now.

ppps: And in case you were wondering why there was no root password set: just run sudo bash from this initial user account.

Source: msdn

Posted in Microsoft

Setting up a Linux FTP server on Windows Azure

Hi!

This post is about hosting FTP in a Linux VM on Windows Azure. And here’s a spoiler alert: the catch is that you may need to lower the TCP keepalive timeout in the Linux kernel to support very long FTP transfer times through the Azure load balancer. But I’ll get to that.

A few weeks ago, a customer needed to run their FTP server on Windows Azure. Being familiar with Linux and having a pretty complex proftpd configuration, the customer decided to keep this all on Linux.

So let’s recall again what’s so special about FTP:

  • FTP uses two connections, a control connection that you use for sending commands to a server and a data connection that gets set up whenever there is data to be transferred.
  • FTP has two ways to set up such a data connection: active and passive. In passive mode, the client opens a second connection to the same server, but on a different port. In active mode, the client creates a listening port, then the server opens a connection to this port on the client.
  • And in the world of client systems behind firewalls and NAT devices, active mode inevitably fails, since hardly any client is still directly reachable from the public internet, let alone able to open a listening port that is reachable from the public internet.
  • Luckily, most off-the-shelf FTP clients, including the ones in web browsers, default to passive mode.
  • There are some funny things you can do with FTP, e.g. FXP, where one FTP server in active mode directly transfers to another ftp server in passive mode.

And recall what’s special about Windows Azure networking:

  • Every connection from the outside to an Azure VM goes through a cloud service endpoint. There are no “exposed hosts”.

So in order to have the “passive” data connections reach their destination, one has to configure a bunch of endpoints in the Azure configuration and then tell the FTP server to use these endpoints for incoming data connections. One could configure each of those endpoints manually through the Windows Azure portal, but that’s time-consuming and error-prone. So let’s use a script to do that… (I’m using the Linux command line tools from http://www.windowsazure.com/en-us/downloads/ )

$ azure vm endpoint create contosoftp 21

$ for ((i=20000;i<20020;i++)); do azure vm endpoint create contosoftp $i; done
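If you want to double-check what the loop will do before it touches your subscription, the same loop can be dry-run by echoing the commands instead of executing them (a sketch; contosoftp is the cloud service name from this example):

```shell
#!/bin/bash
# Dry run: print the endpoint commands instead of executing them,
# so the port range can be verified before hitting the subscription.
for ((i=20000; i<20020; i++)); do
  echo azure vm endpoint create contosoftp $i
done
```

This should print exactly 20 commands, for ports 20000 through 20019, matching the PassivePorts range configured below.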

This creates 20 endpoints for the FTP data connections and the usual port 21 endpoint for the control connection.

Now we need to tell proftpd (or any other FTP daemon of your choice) to use exactly this port range when opening data connection listening sockets.

In /etc/proftpd/proftpd.conf:

PassivePorts 20000 20019

As you may know, Windows Azure VMs use local IP addresses that are not public. In order to tell the client which IP address to talk to when opening the data connection, the FTP server needs to know its external, public IP address, i.e., the address of its endpoint. Proftpd has all the required functionality; it just needs to be enabled via the MasqueradeAddress directive:

MasqueradeAddress contosoftp.cloudapp.net
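Put together, the relevant fragment of /etc/proftpd/proftpd.conf looks like this (the masquerade hostname is this example’s cloud service; substitute your own):

```
# Data connections must use the ports opened as Azure endpoints
PassivePorts      20000 20019
# Advertise the public cloud-service address, not the internal VM address
MasqueradeAddress contosoftp.cloudapp.net
```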

And that’s it.

Now the customer used this configuration, but once in a while one of their users reported that a very long-running FTP transfer would not go through, but break because of a “closed control connection”.

After thinking a bit, we suspected this was a side effect of the Windows Azure load balancer that manages the endpoints. When the load balancer does not see traffic on a connection for a while (at the earliest after about 60 seconds), it may “forget about” an established TCP connection. In our case, the control connection of the ongoing data transfer was idle while the data connection was happily pumping data.

Luckily, there’s a Unix socket option called “TCP keepalive” which makes idle but open connections send a few control packets from time to time to inform everything on the network that the connection is still in use. And proftpd (from version 1.3.5rc1 on) supports a “SocketOptions keepalive on” directive to enable this behavior on its connections. Great!

But even enabling this didn’t solve the issue, since there is a default in the Linux kernel for when these keepalive packets are first sent:

$ cat /proc/sys/net/ipv4/tcp_keepalive_time

7200

OK, that’s 7200 seconds, which is two hours. That’s a bit long for our load balancer.

# sysctl -w net.ipv4.tcp_keepalive_time=60

That’s better. But remember, this is a runtime setting in the Linux kernel, so in order for it to survive a reboot, put it into a convenient place in /etc/rc*.d/
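Alternatively, on most distributions the setting can be persisted in /etc/sysctl.conf, or in a drop-in file under /etc/sysctl.d/ (a sketch; the exact location varies by distro):

```
# /etc/sysctl.conf (fragment): survive reboots
net.ipv4.tcp_keepalive_time = 60
```

After editing the file, `sysctl -p` (as root) applies it without a reboot.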

Hope this helps,

H.


Posted in Microsoft

Hello, World!

~# apt-get install hello

~# hello -n

╔═══════════════╗
║ Hello, World! ║
╚═══════════════╝

About two months ago, I switched jobs. In my last job, I worked as an applied researcher and software development engineer at ATL Europe, an applied Microsoft lab that is part of Microsoft Research.

So here I am, working for Developer and Platform Evangelism at Microsoft Germany. I’ll focus on a couple of things that I’ve been dealing with in the past; these are:

  • Windows Azure
  • Open Source Software, especially Linux on Azure and
  • the “Internet of Things”.

I’ll record my findings in this blog, both from my own experiences and from my work with partners. I’ll blog whenever I learn something that I think will help others.

But please remember: I’m writing these posts at a particular point in time. Hardware, software, services and devices all evolve over time, and what may be true at the time of writing may be different by the time you read this.

Best,

H.

ps: you can find my old personal blog and some info about myself on cubeos.org


Posted in Microsoft

Setting up VPN with Surface and Home Server 2011

Happy new year 2013! It seems I am posting at ever-increasing intervals, so the next blog post should come in January 2017 if the pattern continues.

Anyway, I received my Surface just before Christmas, and it’s an amazing device. But now let’s stop the advertising for a moment and cut to the chase.

One thing I am missing on my Surface is an SSH client. OK, there are some nice Metro (pardon, “New Windows 8 Style”) apps implementing SSH and they work really well, but what I am missing is port forwarding. What I often do is ssh into my home Unix box and use port forwarding of port 3389 to RDP into one of my Windows machines. On Windows 8, I use PuTTY, which just works. But there are no desktop apps on Windows RT, and the Metro apps do not forward ports.

So I decided to set up VPN to my Home Server 2011. There is a nice tutorial by Chris Barnes at http://thedigitalmediazone.com/2012/03/26/how-to-set-up-vpn-for-windows-home-server-2011/ 

On the Surface, it is easy to set up the VPN client, but you need to get to the Network and Sharing Center first.

Press the Windows key, then type “Network and sharing” (“Netzwerk und Freigabe” in German) and touch the “Settings” icon. On the left, the “Network and Sharing Center” icon shows up. Touch the icon; the desktop opens with the Network and Sharing Center window. Then continue as described in the tutorial.

One little thing: When I first set this up, I ended up with an Error 500 and no connection. This happened because I used the DynDNS name of my router as the server address. The VPN is tunneled over SSL, and of course the SSL certificate of the server did not match the DynDNS name. But when you set up remote web access on the home server, it also automatically creates a dynamic DNS name on the domain yourservername.homeserver.com, and this is the name used in the SSL certificate. If you use that name, then VPN is happy.

After you have set up the connection, the desktop is not needed anymore. If you go to the Settings charm and hit the network icon, there is a new VPN connection. Just touch the connection icon and press Connect, then enter your credentials and you’re connected.

Have fun…

H.

Posted in Uncategorized

Microsoft fixing your PC…

I have been working for this company for about three years now, and I still discover new things I wish I had found a long time ago. If you’re a typical PC end user who is frustrated by your PC’s behavior once again, there is a one-stop shop for most of your troubles: http://support.microsoft.com/fixit/ (if you can’t remember this, http://www.microsoft.com/fixit/ also works.)

Just for fun: go to http://support.microsoft.com/fixit/ and click on the “top solutions” button in the upper left, then scroll down. There are things like “Fix power consumption problems and extend the laptop battery life” or “Fix Windows system performance problems on slow Windows computers” and even the classic “Windows won’t print”: “Printing problems and printing errors”.

Posted in Computers, Microsoft

HeadSLAM buzz

We received a bit of publicity around our HeadSLAM presentation at Pervasive and ISWC.

There’s a good short article on the New Scientist website here. Colin Barras, the author of that article, had a few extra questions on our work, and we had a really good e-mail conversation on the subject. He also asked me if we had any video material, but he chose not to use it.

And there is a second article on Engadget here. As the comments on the Engadget article correctly point out, there is nothing to see in the video but Burcu walking down the corridor that we mapped. And since the author omitted the resulting map from his article, it is hard to understand our research from the article alone. As some of the commenters may not have realized that there is a link to the preprint of the article when clicking the picture, I added the resulting map here.

[Image: corr-open-rooms, one of the resulting maps]

So here is one of the map examples produced from the data we recorded. And to limit possible misunderstandings: no, this is not a product, not even a prototype; it’s a scientific experiment only. And the video is for illustration of the recording process only; this is neither an advertisement nor a “cool” demo video, it’s there to show how we recorded the data. We are perfectly aware that this needs hardening to be used by firefighters, that the infrared LIDAR is of little use in real smoke, that there is no smoke in the video (it would have been a bit pointless to show a video recorded in thick black smoke, right?), and that using a normal notebook computer is not really wearable. This “just” shows that the principle works and that it could also work with smoke-penetrating sensors such as radar or sonar.

We will continue to add some material to http://www.cubeos.org/headslam/

And apart from that, I would have appreciated it if Engadget had contacted us before using our video on their site.

Holger

Posted in Research, Wearable Computing

ISWC 2008

I am currently at ISWC. The conference is a bit smaller this year, but the quality of the papers is very good. After two excellent keynotes by Raj Reddy and Marcel Just, the conference program started yesterday morning.

Burcu Cinaz presented our paper on HeadSLAM and she did well, especially considering that it was her first presentation at a major conference and the first technical presentation of the conference.

Today, there was another inspiring keynote by Rory Cooper, and then there were more technical presentations. Now, just before lunch, the panel discussion is taking place.

Yesterday, it was announced that the next ISWC, in Linz, Austria, will take place at the same time as the Ars Electronica festival. This could be a very interesting collaboration.

H.

Posted in Conferences, Research, Wearable Computing

Event update

Meanwhile, the third event I am co-organizing is about to reach its submission deadline: my workshop at the “Mensch und Computer” conference in Lübeck on September 8th, 2008.

Posted in Computer Science, Conferences, Wearable Computing

Upcoming scientific events

There are a couple of scientific events coming up that I am involved in:

CUIPAM08: This is a workshop I am co-organizing together with Prof. Michael Lawo from Uni Bremen. It’s about Context-awareness and User Interfaces for Professional Applications of Mobile and Wearable Computing, and we’re trying to bring researchers together to answer the somewhat provoking question of whether there is any benefit in context awareness, or whether it just adds to the confusion of the user.

IFAWC2008: The call for papers is not out yet, but the International Forum on Applied Wearable Computing will be co-located with ISWC this year in Pittsburgh.

More to come…

H.

Posted in Uncategorized

Microsoft EMIC

Beginning October 1st, 2007, I will be working at Microsoft EMIC in Aachen, Germany, mainly on embedded systems. I will continue to maintain CubeOS.

H.

Posted in Research, Work